Member since: 04-05-2016
Posts: 130
Kudos Received: 93
Solutions: 29
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 3813 | 06-05-2018 01:00 AM
 | 5158 | 04-10-2018 08:23 AM
 | 5669 | 07-18-2017 02:16 AM
 | 2933 | 07-11-2017 01:02 PM
 | 3359 | 07-10-2017 02:10 AM
10-18-2016
06:19 AM
1 Kudo
Hello, I followed the tutorial once again and found what caused this. The tutorial instructs installing NiFi via Ambari, so the NiFi process is started by the 'nifi' user on the sandbox. On the other hand, it instructs cloning the 'iot-truck-streaming' project from GitHub under the 'root' home directory (/root). So even if we add execute permission to the sh file, the nifi user can't access anything under /root. Copying the /root/iot-truck-streaming dir to /home/nifi would be a workaround:

# ssh into the sandbox as root, then:
cp -rp /root/iot-truck-streaming /home/nifi

Then specify /home/nifi/iot-truck-streaming/stream-simulator/generate.sh as 'Command Arguments' on NiFi.
10-18-2016
12:19 AM
Hi @Mothilal marimuthu, thanks for sharing that. I'll check it again to see if there's anything else needed.
10-17-2016
11:36 PM
Hi, I posted the comment above without the 'chmod +x' command; sorry about that. Please log in to the sandbox machine via SSH and execute 'chmod +x /root/iot-truck-streaming/stream-simulator/generate.sh' to add execute permission to the shell script file. Then NiFi will be able to run the sh from ExecuteProcess.
10-17-2016
08:53 AM
Hello, sorry for the delay. The language is JSONPath. JSONPath itself has filtering capabilities; many examples can be found on this page: https://github.com/jayway/JsonPath
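Jayway JsonPath is a Java library, but as a rough illustration of what a JSONPath filter expression selects, here is the equivalent in plain Python (a sketch only; the sample data is made up and NiFi is not involved):

```python
import json

# Hypothetical sample data for illustration
records = json.loads('[{"a": 1, "i": 1}, {"a": 2, "i": 2}, {"a": 1, "i": 3}]')

# A JSONPath filter like $[?(@.a == 1)] keeps the elements whose "a" field
# equals 1, which in plain Python is just a list comprehension:
filtered = [r for r in records if r.get("a") == 1]
print(json.dumps(filtered))  # [{"a": 1, "i": 1}, {"a": 1, "i": 3}]
```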
10-17-2016
07:31 AM
Hi, To add executable permission on the file for every user (since I am not sure which user runs the NiFi process in your environment), use the following command:

chmod +x /root/iot-truck-streaming/stream-simulator/generate.sh

You might need to add sudo at the beginning, depending on which user you are logged in as:

sudo chmod +x /root/iot-truck-streaming/stream-simulator/generate.sh

Hope it works. (Sorry, I pasted the command wrong above; updated.)
10-17-2016
07:11 AM
1 Kudo
Hello, Sorry to hear that you're having trouble with our tutorial. I think some error must be happening when ExecuteProcess runs the shell script. But the processor ignores the stderr stream of the process by default, and currently no bulletin error or log message is shown, so it's difficult to investigate what went wrong. I think there's some room for improvement here, and I'm going to look into it further. In the meantime, please enable 'Redirect Error Stream' to capture the error output of the process and see what is happening, as shown in the attached image. Thanks!
10-13-2016
02:51 AM
3 Kudos
Hello, I think you need to use SplitText and SplitContent. SplitText can split the content into lines; then pass each line to SplitContent, whose delimiter can be configured in hexadecimal format as the 'Byte Sequence'. A semicolon ";" is "3B". Hope this will work for you.
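The two-stage split described above can be sketched outside NiFi like this (a rough illustration with made-up input; the actual splitting is done by the SplitText and SplitContent processors):

```python
# Hypothetical input content
content = b"a;b;c\nd;e\n"

# Stage 1 (like SplitText): one piece per line
lines = content.split(b"\n")

# Stage 2 (like SplitContent): split each line on the byte sequence
# 0x3B, which is the semicolon ";"
parts = [line.split(b"\x3b") for line in lines if line]
print(parts)  # [[b'a', b'b', b'c'], [b'd', b'e']]
```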
10-13-2016
02:41 AM
Another solution would be: SplitJson -> EvaluateJsonPath (store 'a' in an attribute) -> RouteOnAttribute -> MergeContent. But if the desired output is a JSON array, using EvaluateJsonPath is simpler.
10-13-2016
02:38 AM
2 Kudos
Hello, I hope I understand your question properly. If you want to split the example JSON into two flow files, each containing an array of the elements with a=1 or a=2, then the following flow can do the job. The point is passing the same JSON flow file into two EvaluateJsonPath processors; each EvaluateJsonPath then extracts the array it is interested in, using a JSONPath like '$.[?(@.a=='2')]'. I've tested with this input JSON:

[{"a":1, "i": 1},{"a":1, "i": 2},{"a":2, "i": 3},{"a":2, "i": 4}]

Then I got, on the left side:

[{"a":1,"i":1},{"a":1,"i":2}]

and on the right side:

[{"a":2,"i":3},{"a":2,"i":4}]

Hope this helps.
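What the two EvaluateJsonPath processors do can be reproduced with the post's own test data in plain Python (a sketch of the filtering logic only, not of NiFi itself):

```python
import json

# The same input JSON as in the post above
records = json.loads('[{"a":1, "i": 1},{"a":1, "i": 2},{"a":2, "i": 3},{"a":2, "i": 4}]')

# Each EvaluateJsonPath applies one filter:
# left:  $.[?(@.a=='1')]    right: $.[?(@.a=='2')]
left = [r for r in records if r["a"] == 1]
right = [r for r in records if r["a"] == 2]

print(json.dumps(left, separators=(",", ":")))   # [{"a":1,"i":1},{"a":1,"i":2}]
print(json.dumps(right, separators=(",", ":")))  # [{"a":2,"i":3},{"a":2,"i":4}]
```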
08-04-2016
01:54 AM
2 Kudos
Hi Saikrishna, I was able to connect GetHDFS from Apache NiFi (not HDF, but it should work with HDF, too) running on my local PC (the host machine) to the HDP sandbox VM. However, I had to change the VM network setting from NAT to Host-only Adapter. The reason is that after NiFi communicates with the HDFS NameNode (sandbox.hortonworks.com:8020), the NameNode returns a DataNode address, which is a private IP (10.0.2.15:50010) of the VM, and with NAT it can't be reached from the host machine. I saw the following exception in the nifi log:

2016-08-04 10:11:38,106 WARN [Timer-Driven Process Thread-9] o.apache.hadoop.hdfs.BlockReaderFactory I/O error constructing remote block reader.
org.apache.hadoop.net.ConnectTimeoutException: 60000 millis timeout while waiting for channel to be ready for connect. ch : java.nio.channels.SocketChannel[connection-pending remote=/10.0.2.15:50010]
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:533) ~[hadoop-common-2.6.2.jar:na]

Here are the steps I took:
1. Change the Sandbox VM network from NAT to Host-only Adapter
2. Restart the Sandbox VM
3. Log in to the Sandbox VM and use the ifconfig command to get its IP address, in my case 192.168.99.100
4. Add an entry to /etc/hosts on the host machine, in my case: 192.168.99.100 sandbox.hortonworks.com
5. Check connectivity by telnet: telnet sandbox.hortonworks.com 8020
6. Restart NiFi (HDF)

I didn't have to change core-site.xml or hdfs-site.xml. Copying those from the sandbox and then setting them in the processor configuration 'Hadoop Configuration Resources' should work fine. Hope this helps!
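The telnet check in step 5 can also be scripted; a minimal sketch of a TCP connectivity probe (the hostname and port are the ones from the steps above):

```python
import socket

def can_connect(host: str, port: int, timeout: float = 5.0) -> bool:
    """Rough equivalent of `telnet host port`: True if a TCP connect succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Step 5 above, e.g.:
# can_connect("sandbox.hortonworks.com", 8020)
```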