Member since: 04-11-2016
Posts: 471
Kudos Received: 325
Solutions: 118
My Accepted Solutions
Title | Views | Posted
---|---|---
| 2129 | 03-09-2018 05:31 PM
| 2695 | 03-07-2018 09:45 AM
| 2593 | 03-07-2018 09:31 AM
| 4466 | 03-03-2018 01:37 PM
| 2511 | 10-17-2017 02:15 PM
08-29-2016
01:38 PM
1 Kudo
Hi @INDRANIL ROY, This should not be an issue and I am not really sure why you are seeing this behavior. However, here are two suggestions you may want to try:
- Increase the number of threads for your ReplaceText processor (the 'Concurrent Tasks' setting in the Scheduling tab).
- I think your regular expression may be consuming more than it needs to. I am guessing that your regex matches your whole input line and you just want to extract some columns. If this is the case, you could try using '^' and '$' to indicate the start and end of the line respectively. Example: ^(.+)\|(.+)$ (see the small standalone example after this reply).
Just to help me understand the situation (because the current configuration should not cause the issue you are reporting):
- What NiFi version are you using?
- Is the whole file coming into the ReplaceText processor, or do you have one flow file per input line coming into the processor?
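For illustration only, here is a minimal standalone sketch of the anchored expression suggested above, using plain java.util.regex (ReplaceText evaluates Java regular expressions); the pipe-delimited sample line and class name are made up for the example.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class AnchoredRegexExample {
    public static void main(String[] args) {
        // Hypothetical pipe-delimited input line with two columns.
        String line = "john.doe|42";

        // '^' and '$' pin the match to the whole line, and the two groups
        // capture the columns to extract.
        Pattern pattern = Pattern.compile("^(.+)\\|(.+)$");

        Matcher matcher = pattern.matcher(line);
        if (matcher.matches()) {
            System.out.println("column 1: " + matcher.group(1)); // john.doe
            System.out.println("column 2: " + matcher.group(2)); // 42
        }
    }
}
```

In ReplaceText itself the same pattern would go into the processor's search/replace properties with $1, $2 back-references; the snippet above is only meant to show the effect of anchoring the expression.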
08-26-2016
06:27 PM
I'd prefer a solution without using the Sqoop metastore, but I'll try that anyway and let you know. I believe that in this case this is overridden by the connector code, and I am not sure we can do much about it.
08-26-2016
06:25 PM
Yes, I did try the two options, but when I do, the option is interpreted as a Sqoop parameter rather than a parameter of the Teradata connector, and the Sqoop command fails. The documentation recommends using -Dteradata.db.input.* for import parameters and -Dteradata.db.output.* for export parameters, but gives no indication for the common parameters described at the end of the documentation.
08-26-2016
04:53 PM
Hi, What is the syntax to use the queryband option with TDCH? https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.3.6/bk_HortonworksConnectorForTeradata/content/ch_HortonworksConnectorForTeradata.html I tried a lot of -D syntax combinations without any luck... Thanks!
Labels:
- Apache Sqoop
08-26-2016
03:43 PM
Hi, I am looking for a way to specify the name of my Sqoop job (export to Teradata). Here is the command I execute:

/usr/bin/sqoop export \
-Dteradata.db.output.method=internal.fastload \
--connect <connection> \
--connection-manager org.apache.sqoop.teradata.TeradataConnManager \
--username myuser \
--password mypassword \
--table <table> \
--export-dir <sourceDir> \
--num-mappers <mappers> \
--verbose \
--input-null-non-string ''

I tried the following options in the command:

-Dmapreduce.job.id="myJob"
-Dmapreduce.job.name="myJob"

But they seem to be ignored and my job is always named "TeradataSqoopExportJob". Is there a way to rename the job? Thanks!
Labels:
- Apache Sqoop
08-17-2016
02:20 PM
Hi @Adi Jabkowsky, Could you share the stack trace you will find in <NIFI_HOME>/logs/nifi-app.log? Also, could you indicate which version of NiFi you are running? It does sound like a bug, but it may already be resolved. Thanks.
08-16-2016
09:46 AM
1 Kudo
Hi @Timothy Spann, Thanks for reporting the issue; it has been captured here: https://issues.apache.org/jira/browse/NIFI-2546 It is fixed and will be delivered in the next release.
08-10-2016
03:35 PM
@Arturo Opsetmoen Amador, it really depends on your requirements and on what will be pushing data to the queue. I recommend checking these kinds of details with your system admin, depending on what you are trying to achieve.
08-10-2016
01:36 PM
1 Kudo
From my point of view, the situation is simple:
- Your queue does not exist when your script is not running, because the queue is not durable; it exists if and only if your script is running (when it runs, it creates the exchange, the queue, etc.).
- NiFi requires that the queue exist in order to connect to it (as indicated in the logs).
- If you run your script, the queue is created and there is no error in NiFi. But since your script does not send data to the queue, NiFi has nothing to consume and transfer.

What you want is something that sends data to a queue (preferably a durable one), and then to use NiFi to pull data from that queue.
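Purely as an illustration of that last point, here is a minimal sketch of such a producer using the RabbitMQ Java client; the broker host, queue name, and message are hypothetical placeholders, not taken from the original thread.

```java
import com.rabbitmq.client.Channel;
import com.rabbitmq.client.Connection;
import com.rabbitmq.client.ConnectionFactory;
import java.nio.charset.StandardCharsets;

public class DurableQueueProducer {
    public static void main(String[] args) throws Exception {
        ConnectionFactory factory = new ConnectionFactory();
        factory.setHost("localhost"); // hypothetical broker host

        Connection connection = factory.newConnection();
        Channel channel = connection.createChannel();

        // durable = true          -> the queue definition survives a broker restart
        // exclusive = false and
        // autoDelete = false      -> the queue is not removed when this producer
        //                            disconnects, so NiFi can consume from it even
        //                            while the script is not running
        channel.queueDeclare("my-durable-queue", true, false, false, null);

        // Publish a message to the default exchange, routed to the queue by name.
        String message = "hello from the producer";
        channel.basicPublish("", "my-durable-queue", null,
                message.getBytes(StandardCharsets.UTF_8));

        channel.close();
        connection.close();
    }
}
```

With the queue declared this way, NiFi's AMQP consumer processor can be pointed at the same queue name and will find the queue even while the producer is offline.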
08-09-2016
12:26 PM
1 Kudo
If there are no messages sent to the queue, the consuming processor won't get any messages. Is your script publishing messages to the queue? The fact that the consume processor only works when you run your script leads me to think that you have defined a queue that is not persistent (not durable). That means your queue does not exist when no producer is connected to it. You should consider creating a persistent queue.