Member since: 05-07-2018
Posts: 331
Kudos Received: 45
Solutions: 35

My Accepted Solutions
Title | Views | Posted
---|---|---
| 3497 | 09-12-2018 10:09 PM
| 1301 | 09-10-2018 02:07 PM
| 5798 | 09-08-2018 05:47 AM
| 1720 | 09-08-2018 12:05 AM
| 2460 | 08-15-2018 10:44 PM
09-24-2018
01:41 PM
Hi @Abhinav Joshi! Sorry for my late reply. So basically, whenever you convert the template to YML it changes the UUID for the INPUT PORT? If so, that's rather strange and might be a bug; I'm not aware of this behaviour, unfortunately 😞 Did you try a newer version of MiNiFi? Hope this helps!
09-21-2018
01:30 PM
Glad to know that, @Abhinav Joshi! Regarding your question about why you had to change the config.yml manually: I'd say it's because something in the flow was updated (for example, by dragging the input port in again), so the UUID for that specific input port changed. If you found my answer helpful, I'd kindly ask you to accept it as the answer. This encourages other HCC users to keep doing a good job, and helps others find the best answer quickly 🙂
09-18-2018
08:43 PM
Hi @Abhinav Joshi! Sorry for my late reply. Hm, that's strange. In this case, I'd try to enable the DEBUG level for bootstrap in logback.xml. Is it showing only this single message? Could you share the whole message, please? Thanks
09-17-2018
05:46 AM
@A C Just to understand: did you run the spark-submit using yarn-cluster as the master/deploy mode? If so, let's check the job properties for the following parameter: ${resourceManager} Also, here is another example of PySpark + Oozie (using a shell action to submit Spark): https://github.com/hgrif/oozie-pyspark-workflow Hope this helps
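To make the property check concrete, here is a minimal sketch of an Oozie job.properties fragment. The host names and values are assumptions for illustration only; replace them with your cluster's actual addresses:

```
# job.properties (hypothetical values)
nameNode=hdfs://your-namenode-host:8020
resourceManager=your-resourcemanager-host:8032
master=yarn-cluster
oozie.use.system.libpath=true
```

The Spark action in the workflow XML can then reference ${resourceManager} instead of a hard-coded address.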
09-16-2018
05:22 AM
Hi @Abhinav Joshi! Glad to know that I could be helpful. Regarding your issue, this is strange. Well, does the same thing happen if you change the ID for the input ports manually (by editing the config.yml directly)? Hope this helps
09-16-2018
05:01 AM
1 Kudo
Quick question: does it work when run outside of Oozie, e.g. using spark-submit directly?
09-14-2018
02:05 PM
Hello @Abhinav Joshi! Yes, change it in every reference to this port. For example, in my config.yml there are 3 references to the Input Port id:
vmurakami$ cat config.yml | grep -i bcfad0b8-3cdb-15b8-0000-00004b94b9cf
name: FetchFile/success/bcfad0b8-3cdb-15b8-0000-00004b94b9cf
destination id: bcfad0b8-3cdb-15b8-0000-00004b94b9cf
- id: bcfad0b8-3cdb-15b8-0000-00004b94b9cf
After that, restart MiNiFi. Hope this helps
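To make the update step concrete, here is a sketch of replacing the old Input Port UUID everywhere in config.yml in one pass. The file contents, path, and the new UUID are made up for illustration:

```shell
# Build a sample config.yml with three references to the old UUID (illustrative)
cat > /tmp/config.yml <<'EOF'
Connections:
- name: FetchFile/success/bcfad0b8-3cdb-15b8-0000-00004b94b9cf
  destination id: bcfad0b8-3cdb-15b8-0000-00004b94b9cf
Remote Processing Groups:
- Input Ports:
  - id: bcfad0b8-3cdb-15b8-0000-00004b94b9cf
EOF

# Swap every occurrence of the old UUID for the new one in a single pass
sed -i 's/bcfad0b8-3cdb-15b8-0000-00004b94b9cf/11111111-2222-3333-4444-555555555555/g' /tmp/config.yml

# Count the references now carrying the new UUID
grep -c '11111111-2222-3333-4444-555555555555' /tmp/config.yml
```

After the edit, restarting MiNiFi (as above) picks up the new IDs.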
09-14-2018
01:08 AM
Hello @n c! It would be like this:
insert into testtbl
select * from (
  select named_struct('tid','id21314','action','some action','createdts',current_timestamp()),
         '232402909',
         '123091203910'
) as dummy;
Hope this helps.
09-13-2018
10:51 PM
Hello @Cody kamat! Could you run an EXPLAIN on one of your queries and share the result with us? BTW, a couple of questions related to your queries:
- Are you using TEZ or LLAP?
- Are you using mapJoin? Is Vertex enabled?
- Is CBO enabled as well?
- Do your tables have partitions/buckets?
- Are you using a file format like Parquet, Avro, or ORC?
Thanks
09-13-2018
10:24 PM
Hello @Kapil Kaushik! Did you check the path of the flow file on this specific node? If not, could you check with the following steps?
# First, identify the location of the nifi.properties being used by the NiFi process
[root@node2 ~]# ps -ef | grep -i nifi.properties
nifi 1319462 1319414 4 Sep07 ? 06:23:51 /usr/jdk64/jdk1.8.0_112/bin/java -classpath /usr/hdf/current/nifi/conf:/usr/hdf/current/nifi/lib/javax.servlet-api-3.1.0.jar:/usr/hdf/current/nifi/lib/jcl-over-slf4j-1.7.25.jar:/usr/hdf/current/nifi/lib/nifi-nar-utils-1.5.0.3.1.2.0-7.jar:/usr/hdf/current/nifi/lib/jetty-schemas-3.1.jar:/usr/hdf/current/nifi/lib/jul-to-slf4j-1.7.25.jar:/usr/hdf/current/nifi/lib/log4j-over-slf4j-1.7.25.jar:/usr/hdf/current/nifi/lib/nifi-properties-1.5.0.3.1.2.0-7.jar:/usr/hdf/current/nifi/lib/logback-classic-1.2.3.jar:/usr/hdf/current/nifi/lib/logback-core-1.2.3.jar:/usr/hdf/current/nifi/lib/nifi-api-1.5.0.3.1.2.0-7.jar:/usr/hdf/current/nifi/lib/nifi-framework-api-1.5.0.3.1.2.0-7.jar:/usr/hdf/current/nifi/lib/slf4j-api-1.7.25.jar:/usr/hdf/current/nifi/lib/nifi-runtime-1.5.0.3.1.2.0-7.jar -Dorg.apache.jasper.compiler.disablejsr199=true -Xmx512m -Xms512m -Dambari.application.id=nifi -Dambari.metrics.collector.url=http://node4:6188/ws/v1/timeline/metrics -Djavax.security.auth.useSubjectCredsOnly=true -Djava.security.egd=file:/dev/urandom -Dsun.net.http.allowRestrictedHeaders=true -Djava.net.preferIPv4Stack=true -Djava.awt.headless=true -XX:+UseG1GC -Djava.protocol.handler.pkgs=sun.net.www.protocol -Dnifi.properties.file.path=/usr/hdf/current/nifi/conf/nifi.properties -Dnifi.bootstrap.listen.port=45115 -Dapp=NiFi -Dorg.apache.nifi.bootstrap.config.log.dir=/var/log/nifi org.apache.nifi.NiFi -K /usr/hdf/current/nifi/conf/sensitive.key
# Then, check the config that says where the flow.xml.gz should be
[root@node2 ~]# cat /usr/hdf/current/nifi/conf/nifi.properties | grep -i flow.xml
nifi.flow.configuration.file=/var/lib/nifi/conf/flow.xml.gz
After getting the path used for the flow.xml, try replacing that file with your healthy flow.xml.gz. Hope this helps!
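The grep step above can be boiled down to pulling just the path out of nifi.properties. A sketch against a sample file (the file location is illustrative; on a real node you'd point it at the path found via the ps output):

```shell
# Sample nifi.properties with the relevant key (illustrative)
cat > /tmp/nifi.properties <<'EOF'
nifi.flow.configuration.file=/var/lib/nifi/conf/flow.xml.gz
nifi.flow.configuration.archive.dir=/var/lib/nifi/conf/archive/
EOF

# Extract only the value of nifi.flow.configuration.file
grep '^nifi.flow.configuration.file=' /tmp/nifi.properties | cut -d= -f2
# -> /var/lib/nifi/conf/flow.xml.gz
```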
09-13-2018
10:16 PM
Hello @Teresa Tavernelli! Glad to know that you made it 🙂 Regarding MySQL, you can give it a shot with the following parameters:
hostname -> your-sandbox-hostname
port -> 3306
user -> root
PS: I didn't test this myself. If it doesn't work, try the Hive configs for MySQL (instead of using root as the username, change it to hive). Hope this helps!
09-13-2018
10:09 PM
Hello @n c! AFAIK you can use Avro with binary encoding, since binary-encoded content has Avro compatibility. For further details, take a look at this link: https://www.michael-noll.com/blog/2013/03/17/reading-and-writing-avro-files-from-the-command-line/ You can give avro-tools a shot to figure out whether your binary data coming from Kafka/Flume has an Avro schema embedded in it. Lastly, take a look at this link; it says which encodings Avro accepts for data: https://avro.apache.org/docs/1.8.1/spec.html#Encodings Hope this helps
09-13-2018
09:57 PM
Hello @PPR Reddy. Did you try to escape it by passing its hexadecimal value? https://sqoop.apache.org/docs/1.4.2/SqoopUserGuide.html#_large_objects I didn't give it a shot, but I'd say it would be like this:
sqoop import <YOUR ORACLE CONN STRING> --fields-terminated-by '\0xD'  # the hexadecimal representation of ^M
Hope this helps!
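To double-check what ^M actually is before building the Sqoop argument: it is a carriage return, byte 0x0D. A quick sketch to confirm the value:

```shell
# ^M is a carriage return; dump its byte value in hex
printf '\r' | od -An -tx1 | tr -d ' '
# -> 0d
```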
09-12-2018
10:09 PM
Hello @Abhinav Joshi! Quick question: is your INPUT PORT in the NiFi flow up? BTW, I once saw this error from MiNiFi when I changed my Input Port in the NiFi flow and forgot to update config.yml with the new ID from the Input Port. Hope this helps
09-12-2018
05:24 PM
Excellent explanation, Matt!
09-12-2018
03:46 PM
Also, I found this link: https://help.pentaho.com/Documentation/7.1/0H0/Set_Up_Pentaho_to_Connect_to_a_Hortonworks_Cluster Not sure if it's going to help you 🙂
09-12-2018
03:45 PM
Hi @Teresa Tavernelli! Could you try one thing? Test changing localhost to the name of the sandbox machine (it should be something like sandbox-hdp.hortonworks.com:10000). Also, since I'm not familiar with the Pentaho tools: do you have any Kettle config where you can put the hive-site.xml (the Hive configs)? Hope this helps!
09-12-2018
01:32 PM
1 Kudo
Hello @sandra Alvarez! I'm glad you made it 🙂 If you found this answer helpful, I'd kindly ask you to accept it as the best answer. Doing this will encourage other HCC users to keep doing a good job, and will also help others find the answer faster. Thank you.
09-11-2018
10:11 PM
Hi @Marshal Tito! I'm glad you made it 🙂 Regarding the aux jar: did you set it through Ambari? If so, you should be able to use the jar without adding it manually. Otherwise, try these steps: https://community.hortonworks.com/content/supportkb/48734/how-to-permanently-add-custom-jar-files-to-hive-in.html Hope this helps.
09-11-2018
03:33 PM
Hello @Sai Krishna Makineni! I found a link with an error similar to yours: https://community.hortonworks.com/content/supportkb/49525/receiving-error-javalangnullpointerexception-and-o.html Hope this helps
09-11-2018
03:12 PM
Hello @jessica Chen! Just to confirm, which language is your OS using? I'm asking because I spotted this link: https://community.hortonworks.com/questions/23409/there-is-a-problem-when-install-hdp-on-the-stepcon.html Hope this helps!
09-11-2018
03:01 PM
Hello @John King! Not sure if the issue persists, but I have a few questions about it:
- Are you running MiNiFi as a service or as an agent (through the bat file)?
- If you're running it through the bat file, are you running cmd/PowerShell as administrator?
BTW, could you enable DEBUG mode for the TailFile processor and share the logs with us? E.g.:
<logger name="org.apache.nifi.processors.standard.TailFile" level="DEBUG"/>
Lastly, if you're able to make a quick test and install Cygwin on the Windows server: are you able to tail the file with the same user that's running MiNiFi? Hope this helps
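For placement, the logger line above goes inside the <configuration> element of MiNiFi's conf/logback.xml. A minimal sketch, with the surrounding elements shown only to indicate where the line belongs (your existing appenders stay as they are):

```xml
<configuration>
  <!-- ... existing appenders and loggers stay here ... -->

  <!-- Enable verbose logging for the TailFile processor only -->
  <logger name="org.apache.nifi.processors.standard.TailFile" level="DEBUG"/>
</configuration>
```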
09-11-2018
02:00 PM
Hello @Thanuja Kularathna! Did you try adding these properties in Custom hdfs-site? You can do this by going to: Ambari > HDFS > Config > Advanced > Custom hdfs-site > scroll to the bottom of Custom hdfs-site and click Add Property. Then save the new properties and restart the service. Hope this helps!
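What Ambari writes for each entry added under Custom hdfs-site is a standard hdfs-site.xml property block. A sketch with a placeholder name and value; the real property names are whatever you are trying to set:

```xml
<!-- hypothetical property; substitute the actual name/value you need -->
<property>
  <name>dfs.example.setting</name>
  <value>example-value</value>
</property>
```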
09-11-2018
01:40 PM
Hello @Marshal Tito! Which message do you see when you enable DEBUG? Also, just to confirm, do you see your jar added to the aux libs?
hive -e "set hive.aux.jars.path;"
Hope this helps.
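If the jar does not show up there, one way to register it permanently is via hive.aux.jars.path in hive-site.xml. A hedged sketch; the jar path below is made up for illustration:

```xml
<!-- hive-site.xml: register an auxiliary jar (illustrative path) -->
<property>
  <name>hive.aux.jars.path</name>
  <value>file:///usr/local/hive/auxlib/my-udf.jar</value>
</property>
```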
09-11-2018
04:17 AM
Hello @Maxim Neaga! I'm glad that I could help you 🙂 Yeah, your question makes total sense. I'd say it's because you have 3 truststores:
- 1 x truststore for the NiFi nodes
- 1 x for the Ranger x NiFi plugin
- 1 x for Ranger Admin (which is the default Java cacerts)
Hope this helps!
09-10-2018
04:54 PM
Hi @A C. At first glance, I can't see anything misconfigured. Take a look at this article to see if it helps you with something: https://community.hortonworks.com/articles/84071/apache-ambari-workflow-manager-view-for-apache-ooz-2.html Hope this helps
09-10-2018
04:14 PM
Hi @A C. You're right, this is the YARN Web UI 🙂 Hm, from what I can see, it looks like YARN didn't launch your Spark application. Do you mind sharing your Oozie workflow XML with us? Thanks.
09-10-2018
02:55 PM
Hmm, @tauqeer khan, good question. I've never tested it myself, but at first glance I'd guess you can take a look at the following answer: https://community.hortonworks.com/questions/182344/how-to-copy-data-from-a-hive-table-recurrently-usi.html and try to build something similar 🙂 BTW, it's a good idea to test it; let us know if it works or if you run into anything. Hope this helps!
09-10-2018
02:26 PM
Hello @A C. What do you see in the YARN UI? Is there any application_id running for your Oozie workflow/Spark job? Thanks.