Member since: 11-16-2015
Posts: 892
Kudos Received: 649
Solutions: 245
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 5190 | 02-22-2024 12:38 PM |
| | 1337 | 02-02-2023 07:07 AM |
| | 3004 | 12-07-2021 09:19 AM |
| | 4154 | 03-20-2020 12:34 PM |
| | 13947 | 01-27-2020 07:57 AM |
12-30-2016
09:21 PM
In addition to @Pierre Villard's answer (which nicely gets the job done with ExecuteScript; I have a similar example here), since you are looking to do row-level operations (i.e., selecting columns from each row), you could use SplitText to split the large file into individual lines, then your ReplaceText from above, then MergeContent to put the whole thing back together. I'm not sure which approach is faster; it would be an interesting exercise to try both.
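For illustration only, here is a minimal Python sketch of the same row-level idea (this is not an ExecuteScript body; the comma delimiter and the chosen column indices are assumptions for the example):

```python
# Sketch of the split -> select-columns -> merge idea, assuming a
# comma-delimited file and that we want columns 0 and 2 from every row.
def select_columns(text, indices=(0, 2), delimiter=","):
    output_lines = []
    for line in text.splitlines():                 # SplitText: one line per record
        fields = line.split(delimiter)
        kept = [fields[i] for i in indices if i < len(fields)]
        output_lines.append(delimiter.join(kept))  # ReplaceText: keep only the selected columns
    return "\n".join(output_lines)                 # MergeContent: reassemble the file

if __name__ == "__main__":
    sample = "a,b,c\n1,2,3\n4,5,6"
    print(select_columns(sample))                  # -> "a,c" / "1,3" / "4,6"
```

Each comment maps to the corresponding NiFi processor in the SplitText/ReplaceText/MergeContent flow.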
06-28-2018
03:49 PM
In my experience, the connection error goes away if you remove "thrift://" from the URI.
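For example (a hypothetical value; the host and port are made up for illustration), that would mean changing a URI like `thrift://sandbox.hortonworks.com:9083` to `sandbox.hortonworks.com:9083` in the processor configuration.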
12-02-2016
06:57 PM
What error(s) are you seeing? If the error mentions Avro and your column names are in Chinese, it's likely that Avro does not accept them. This may be alleviated in NiFi 1.1.0 with NIFI-2262, but that change only replaces non-Avro-compatible characters with underscores, so you may then face a "duplicate field" exception. In that case you would need column aliases in your SELECT statement so the columns use Avro-compatible names.
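To make the failure mode concrete, here is a small Python sketch (my own illustration, not NiFi's code; the exact replacement rule, the Chinese column names, and the table name below are assumptions based on Avro's `[A-Za-z0-9_]` field-name restriction):

```python
import re

def to_avro_name(name):
    """Replace characters Avro does not allow in field names with underscores."""
    cleaned = re.sub(r"[^A-Za-z0-9_]", "_", name)
    if cleaned and cleaned[0].isdigit():   # Avro names may not start with a digit
        cleaned = "_" + cleaned
    return cleaned

columns = ["姓名", "年龄"]                      # two distinct Chinese column names
print([to_avro_name(c) for c in columns])       # -> ['__', '__']: both collapse to the same field name
```

Because both names collapse to the same underscore-only field, the schema would contain a duplicate field, which is why aliasing in the query (e.g. `SELECT 姓名 AS name, 年龄 AS age FROM mytable`, names made up for the example) avoids the problem.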
08-18-2016
01:42 PM
I found the problem. The cause was HBase: I was sending the same value as the row key for every record, so it could not work. After changing the key values, everything is working fine. Thanks.
08-17-2016
01:43 AM
1 Kudo
@Randy Gelhausen Here is the NiFi Jira to capture this idea: https://issues.apache.org/jira/browse/NIFI-2585
11-30-2016
01:44 PM
2016-11-29 14:50:59,544 INFO [Write-Ahead Local State Provider Maintenance] org.wali.MinimalLockingWriteAheadLog org.wali.MinimalLockingWriteAheadLog@6dc5e857 checkpointed with 3 Records and 0 Swap Files in 25 milliseconds (Stop-the-world time = 11 milliseconds, Clear Edit Logs time = 9 millis), max Transaction ID 8
2016-11-29 14:51:06,659 WARN [Timer-Driven Process Thread-7] o.apache.hadoop.hdfs.BlockReaderFactory I/O error constructing remote block reader.
java.io.IOException: An existing connection was forcibly closed by the remote host
    at sun.nio.ch.SocketDispatcher.read0(Native Method) ~[na:1.8.0_111]
2016-11-29 14:51:06,659 WARN [Timer-Driven Process Thread-7] org.apache.hadoop.hdfs.DFSClient Failed to connect to sandbox.hortonworks.com/127.0.0.1:50010 for block, add to deadNodes and continue. java.io.IOException: An existing connection was forcibly closed by the remote host
java.io.IOException: An existing connection was forcibly closed by the remote host
    at java.lang.Thread.run(Thread.java:745) [na:1.8.0_111]
2016-11-29 14:51:06,660 WARN [Timer-Driven Process Thread-7] org.apache.hadoop.hdfs.DFSClient Could not obtain block: BP-1464254149-172.17.0.2-1477381671113:blk_1073742577_1761 file=/user/admin/Data/trucks.csv No live nodes contain current block Block locations: 172.17.0.2:50010 Dead nodes: 172.17.0.2:50010. Throwing a BlockMissingException
2016-11-29 14:51:06,660 WARN [Timer-Driven Process Thread-7] org.apache.hadoop.hdfs.DFSClient Could not obtain block: BP-1464254149-172.17.0.2-1477381671113:blk_1073742577_1761 file=/user/admin/Data/trucks.csv No live nodes contain current block Block locations: 172.17.0.2:50010 Dead nodes: 172.17.0.2:50010. Throwing a BlockMissingException
2016-11-29 14:51:06,660 WARN [Timer-Driven Process Thread-7] org.apache.hadoop.hdfs.DFSClient DFS Read
org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-1464254149-172.17.0.2-1477381671113:blk_1073742577_1761 file=/user/admin/Data/trucks.csv
    at org.apache.hadoop.hdfs.DFSInputStream.chooseDataNode(DFSInputStream.java:889) [hadoop-hdfs-2.6.2.jar:na]
2016-11-29 14:51:06,660 ERROR [Timer-Driven Process Thread-7] o.apache.nifi.processors.hadoop.GetHDFS GetHDFS[id=abb1f7a5-0158-1000-f1d4-ef83203b4aa1] Error retrieving file hdfs://sandbox.hortonworks.com:8020/user/admin/Data/trucks.csv from HDFS due to org.apache.nifi.processor.exception.FlowFileAccessException: Failed to import data from org.apache.hadoop.hdfs.client.HdfsDataInputStream@7bea77c5 for StandardFlowFileRecord[uuid=34551c53-72ad-40fa-927d-5ac60fe6d83e,claim=,offset=0,name=712611918461157,size=0] due to org.apache.nifi.processor.exception.FlowFileAccessException: Unable to create ContentClaim due to org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-1464254149-172.17.0.2-1477381671113:blk_1073742577_1761 file=/user/admin/Data/trucks.csv: org.apache.nifi.processor.exception.FlowFileAccessException: Failed to import data from org.apache.hadoop.hdfs.client.HdfsDataInputStream@7bea77c5 for StandardFlowFileRecord[uuid=34551c53-72ad-40fa-927d-5ac60fe6d83e,claim=,offset=0,name=712611918461157,size=0] due to org.apache.nifi.processor.exception.FlowFileAccessException: Unable to create ContentClaim due to org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-1464254149-172.17.0.2-1477381671113:blk_1073742577_1761 file=/user/admin/Data/trucks.csv
2016-11-29 14:51:06,661 ERROR [Timer-Driven Process Thread-7] o.apache.nifi.processors.hadoop.GetHDFS
org.apache.nifi.processor.exception.FlowFileAccessException: Failed to import data from org.apache.hadoop.hdfs.client.HdfsDataInputStream@7bea77c5 for StandardFlowFileRecord[uuid=34551c53-72ad-40fa-927d-5ac60fe6d83e,claim=,offset=0,name=712611918461157,size=0] due to org.apache.nifi.processor.exception.FlowFileAccessException: Unable to create ContentClaim due to org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-1464254149-172.17.0.2-1477381671113:blk_1073742577_1761 file=/user/admin/Data/trucks.csv
    at org.apache.nifi.controller.repository.StandardProcessSession.importFrom(StandardProcessSession.java:2479) ~[na:na]
Caused by: org.apache.nifi.processor.exception.FlowFileAccessException: Unable to create ContentClaim due to org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-1464254149-172.17.0.2-1477381671113:blk_1073742577_1761 file=/user/admin/Data/trucks.csv
    at org.apache.nifi.controller.repository.StandardProcessSession.importFrom(StandardProcessSession.java:2472) ~[na:na]
... 14 common frames omitted
06-14-2017
02:23 PM
I confirmed this to be a bug in ConvertJSONToSQL and have written up NIFI-4071; please see the Jira for details.
11-02-2016
06:11 PM
@sai d You must have VT-x enabled in your computer's BIOS; this is a common requirement for most virtual machines these days. What kind of computer are you using, and have you enabled VT-x?