Member since: 04-05-2016
Posts: 130
Kudos Received: 93
Solutions: 29

My Accepted Solutions
Title | Views | Posted
---|---|---
 | 3813 | 06-05-2018 01:00 AM
 | 5159 | 04-10-2018 08:23 AM
 | 5669 | 07-18-2017 02:16 AM
 | 2933 | 07-11-2017 01:02 PM
 | 3359 | 07-10-2017 02:10 AM
12-16-2016
07:45 AM
2 Kudos
Hello @Mohan V, PutKafka is designed to work with the Kafka 0.8 series. If you're using Kafka 0.9, please use the PublishKafka processor instead. Thanks, Koji
12-08-2016
02:18 AM
@Saikrishna Tarapareddy In that case, I'd use another command to list the sub folders first, then pass each sub folder to a zip command. The NiFi flow looks like the following.

List Sub Folders (ExecuteProcess): I used the find command here: find source-files -type d -depth 1. The command produces a flow file containing the sub folders, one per line. Split that output, then use ExtractText to extract each sub folder path into an attribute named 'targetDir'. You need to add a dynamic property by clicking the plus sign and name the property after the attribute you want to extract the content to; the value is a regular expression that extracts the desired text from the content.

Zip Each Sub Folder (ExecuteStreamCommand): I used ExecuteStreamCommand this time so it can work on the incoming flow files.
- Command Path: zip
- Command Arguments: -r;${targetDir}-${now():format('yyyyMMdd')}.zip;${targetDir}
- Ignore STDIN: true

Then it will zip each sub folder. Thanks, Koji
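For reference, the end-to-end logic of this flow is roughly what the following shell sketch does (assuming GNU find options and the 'source-files' parent directory from the post):

```bash
# Rough, untested equivalent of the flow above: list the immediate sub
# folders of source-files, then zip each one into <folder>-<yyyyMMdd>.zip.
find source-files -mindepth 1 -maxdepth 1 -type d | while read -r targetDir; do
  zip -r "${targetDir}-$(date +%Y%m%d).zip" "$targetDir"
done
```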
04-24-2017
05:38 PM
Thanks a lot for the great tutorial. How could this be extended to not only listen on a WebSocket, but also periodically send control commands, for example `{"op":"unconfirmed_sub"}` for https://blockchain.info/api/api_websocket?
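One way to experiment with such a control message outside of NiFi is a generic WebSocket client; inside NiFi it would typically mean adding a branch that feeds PutWebSocket. The command below is only a sketch: it assumes the third-party websocat tool is installed and that wss://ws.blockchain.info/inv is the endpoint described by the linked API page.

```bash
# Send the subscription command, then keep stdin open with `cat` so the
# connection stays up and incoming messages are printed to stdout.
{ echo '{"op":"unconfirmed_sub"}'; cat; } | websocat wss://ws.blockchain.info/inv
```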
09-11-2018
11:01 PM
Hello, in this solution, is the NiFi cluster also deployed on Docker? Thanks
11-25-2016
11:57 AM
Hi @mayki wogno The first error message was caused by the same underlying error as the second one. The processor reported the error twice because ListHDFS logged an error message when it caught the exception and then re-threw it, and the NiFi framework caught the re-thrown exception and logged another error message. When the NiFi framework catches an exception thrown by a processor, it yields the processor for the amount of time specified by 'Yield Duration'. Once the processor can successfully access core-site.xml and hdfs-site.xml, both error messages will be cleared.
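In case it helps, those files are usually referenced through the processor's 'Hadoop Configuration Resources' property; the paths below are just an example of a typical layout and may differ on your cluster:

```
Hadoop Configuration Resources: /etc/hadoop/conf/core-site.xml,/etc/hadoop/conf/hdfs-site.xml
```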
11-24-2016
12:30 AM
@Saikrishna Tarapareddy I hadn't read this comment when I wrote a reply a few seconds ago. Glad to hear that it worked!
10-20-2016
02:18 PM
@Timothy: I have this flow, and I'm using this curl command to test it: curl -i -v -F file=@/var/opt/hosting/log/flume/flume-a1.log http://nifi011:10000/contentListener
RouteOnAttribute has the property abc ==> ${filename:contains('flume')}. Why is the flume log not sent to HDFS?
12-20-2017
12:40 PM
thanks it worked!!
04-12-2017
04:11 PM
1 Kudo
@Vadim Vaks I'm trying to do the same thing with Atlas 0.8, but I can't delete entries within the inputs or outputs array with this method. With the V2 API, the elements didn't change. With the V1 API, new elements are added even if I removed some from the inputs array. The inputs had two entries before the POST request, and when I posted a single input entry it got added:

"inputs": [
  {
    "guid": "688ed1ee-222c-4416-8bf4-ba107b7fbc2c",
    "typeName": "kafka_topic"
  },
  {
    "guid": "bf3784db-fa59-4803-ad41-c5653f242f6f",
    "typeName": "kafka_topic"
  },
  {
    "guid": "688ed1ee-222c-4416-8bf4-ba107b7fbc2c",
    "typeName": "kafka_topic"
  }
],

Please let me know how to remove elements from inputs/outputs with Atlas 0.8. Thanks!
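For context, the kind of full-entity update being attempted against the V2 API looks roughly like the request below; this is only a sketch, and the host, credentials, endpoint path, and payload file name are assumptions rather than details from the post.

```bash
# Hypothetical request: POST the complete process entity (with the desired
# inputs array) to the Atlas V2 entity endpoint. All names are placeholders.
curl -u admin:admin -X POST -H "Content-Type: application/json" \
     -d @process-entity-with-trimmed-inputs.json \
     http://atlas-host:21000/api/atlas/v2/entity
```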
11-30-2016
01:44 PM
2016-11-29 14:50:59,544 INFO [Write-Ahead Local State Provider Maintenance] org.wali.MinimalLockingWriteAheadLog org.wali.MinimalLockingWriteAheadLog@6dc5e857 checkpointed with 3 Records and 0 Swap Files in 25 milliseconds (Stop-the-world time = 11 milliseconds, Clear Edit Logs time = 9 millis), max Transaction ID 8

2016-11-29 14:51:06,659 WARN [Timer-Driven Process Thread-7] o.apache.hadoop.hdfs.BlockReaderFactory I/O error constructing remote block reader.
java.io.IOException: An existing connection was forcibly closed by the remote host
at sun.nio.ch.SocketDispatcher.read0(Native Method) ~[na:1.8.0_111]

2016-11-29 14:51:06,659 WARN [Timer-Driven Process Thread-7] org.apache.hadoop.hdfs.DFSClient Failed to connect to sandbox.hortonworks.com/127.0.0.1:50010 for block, add to deadNodes and continue. java.io.IOException: An existing connection was forcibly closed by the remote host
java.io.IOException: An existing connection was forcibly closed by the remote host
at java.lang.Thread.run(Thread.java:745) [na:1.8.0_111]

2016-11-29 14:51:06,660 WARN [Timer-Driven Process Thread-7] org.apache.hadoop.hdfs.DFSClient Could not obtain block: BP-1464254149-172.17.0.2-1477381671113:blk_1073742577_1761 file=/user/admin/Data/trucks.csv No live nodes contain current block Block locations: 172.17.0.2:50010 Dead nodes: 172.17.0.2:50010. Throwing a BlockMissingException

2016-11-29 14:51:06,660 WARN [Timer-Driven Process Thread-7] org.apache.hadoop.hdfs.DFSClient Could not obtain block: BP-1464254149-172.17.0.2-1477381671113:blk_1073742577_1761 file=/user/admin/Data/trucks.csv No live nodes contain current block Block locations: 172.17.0.2:50010 Dead nodes: 172.17.0.2:50010. Throwing a BlockMissingException

2016-11-29 14:51:06,660 WARN [Timer-Driven Process Thread-7] org.apache.hadoop.hdfs.DFSClient DFS Read
org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-1464254149-172.17.0.2-1477381671113:blk_1073742577_1761 file=/user/admin/Data/trucks.csv
at org.apache.hadoop.hdfs.DFSInputStream.chooseDataNode(DFSInputStream.java:889) [hadoop-hdfs-2.6.2.jar:na]

2016-11-29 14:51:06,660 ERROR [Timer-Driven Process Thread-7] o.apache.nifi.processors.hadoop.GetHDFS GetHDFS[id=abb1f7a5-0158-1000-f1d4-ef83203b4aa1] Error retrieving file hdfs://sandbox.hortonworks.com:8020/user/admin/Data/trucks.csv from HDFS due to org.apache.nifi.processor.exception.FlowFileAccessException: Failed to import data from org.apache.hadoop.hdfs.client.HdfsDataInputStream@7bea77c5 for StandardFlowFileRecord[uuid=34551c53-72ad-40fa-927d-5ac60fe6d83e,claim=,offset=0,name=712611918461157,size=0] due to org.apache.nifi.processor.exception.FlowFileAccessException: Unable to create ContentClaim due to org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-1464254149-172.17.0.2-1477381671113:blk_1073742577_1761 file=/user/admin/Data/trucks.csv: org.apache.nifi.processor.exception.FlowFileAccessException: Failed to import data from org.apache.hadoop.hdfs.client.HdfsDataInputStream@7bea77c5 for StandardFlowFileRecord[uuid=34551c53-72ad-40fa-927d-5ac60fe6d83e,claim=,offset=0,name=712611918461157,size=0] due to org.apache.nifi.processor.exception.FlowFileAccessException: Unable to create ContentClaim due to org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-1464254149-172.17.0.2-1477381671113:blk_1073742577_1761 file=/user/admin/Data/trucks.csv

2016-11-29 14:51:06,661 ERROR [Timer-Driven Process Thread-7] o.apache.nifi.processors.hadoop.GetHDFS
org.apache.nifi.processor.exception.FlowFileAccessException: Failed to import data from org.apache.hadoop.hdfs.client.HdfsDataInputStream@7bea77c5 for StandardFlowFileRecord[uuid=34551c53-72ad-40fa-927d-5ac60fe6d83e,claim=,offset=0,name=712611918461157,size=0] due to org.apache.nifi.processor.exception.FlowFileAccessException: Unable to create ContentClaim due to org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-1464254149-172.17.0.2-1477381671113:blk_1073742577_1761 file=/user/admin/Data/trucks.csv
at org.apache.nifi.controller.repository.StandardProcessSession.importFrom(StandardProcessSession.java:2479) ~[na:na]
Caused by: org.apache.nifi.processor.exception.FlowFileAccessException: Unable to create ContentClaim due to org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-1464254149-172.17.0.2-1477381671113:blk_1073742577_1761 file=/user/admin/Data/trucks.csv
at org.apache.nifi.controller.repository.StandardProcessSession.importFrom(StandardProcessSession.java:2472) ~[na:na]
... 14 common frames omitted