Member since: 07-16-2018 · Posts: 14 · Kudos Received: 1 · Solutions: 0
07-27-2018
04:41 PM
Hi @Matt Burgess How about if I just want to keep the content as an attribute? My scenario is that I want a user to supply parameters through a CSV, which I can parse and use as attributes. For example, if the user wants to import a table, they write the table name in the CSV and I use it as an attribute on the flowfile. My present approach is: ListFile -> FetchFile -> SplitText -> ExtractText -> UpdateAttribute, but it doesn't seem to be working out. Any suggestions?
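For reference, a sketch of the relevant settings, assuming the CSV carries one table name per line (the attribute name and regex here are mine, purely illustrative):

    SplitText
      Line Split Count : 1

    ExtractText
      table.name = ^\s*(\S+)\s*$    (dynamic property; the first capture group
                                     lands in the attribute "table.name")

Downstream processors can then reference the value as ${table.name}.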
07-25-2018
11:15 AM
Hi @Matt Burgess I applied your solution and it works. But for some reason it keeps executing the same SELECT statement over a table, which gives a FILE ALREADY EXISTS error in PutHDFS. Is it because I have not supplied a Maximum-value Column? What if we don't have a primary key that can be used as a Maximum-value Column? I hope I make sense; if not, please do let me know.
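A config sketch of what I assume is needed (the table and column names are placeholders):

    QueryDatabaseTable
      Table Name            : MY_TABLE
      Maximum-value Columns : LAST_MODIFIED   (any monotonically increasing
                                               column; it need not be a primary key)

Without a Maximum-value Column the processor keeps no state, so every scheduled run re-fetches the entire table, which would explain the repeated file in PutHDFS.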
07-24-2018
05:18 PM
Thanks @Timothy Spann. I have worked with Sqoop, but Sqoop takes a lot of time to import data, and right now I'm looking for an alternative. With Sqoop we would have to maintain logs separately, whereas with NiFi I get built-in logs and don't have to worry about scheduling. So yes, Sqoop is the last option, but NiFi is something I want to try without any dependency on Sqoop.
07-24-2018
04:22 PM
1 Kudo
Thanks @Matt Burgess. I'll try the solution and keep everyone updated.
07-24-2018
04:04 PM
I'm trying to ingest 500 tables from Oracle to HDFS. My workflow is QueryDatabaseTable -> ConvertAvroToORC -> PutHDFS -> ReplaceText -> PutHiveQL, and it is working pretty nicely. But since I have to ingest 500 tables, ingesting them one by one wouldn't make sense. How can I give a list of tables and make it ingest automatically, like schema-level ingestion? I was trying to read a file (which contains all schema.table names), split it, create a NiFi Expression Language reference, and pass it to QueryDatabaseTable, but QueryDatabaseTable doesn't take any input flowfile. Any help will be appreciated. Thanks
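A sketch of what I attempted, assuming one schema.table name per line in the input file (the processors and regex here are illustrative):

    GetFile (tables list) -> SplitText (Line Split Count: 1)
        -> ExtractText (dynamic property  db.table = ^\s*(\S+)\s*$)
        -> QueryDatabaseTable (Table Name: ${db.table})

The last hop is the problem: QueryDatabaseTable is a source processor and does not accept an incoming connection, so the extracted attribute never reaches it.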
Labels: Apache NiFi
07-24-2018
03:55 PM
The solution provided by @Harald via the link is good enough. The only prerequisite is that the cluster be on top of Azure Data Lake and not Azure Storage. If it's a cluster, the solution should be replicated on all nodes; if it's only an edge node, it should be applied only on the edge node. By solution I mean the one in https://community.hortonworks.com/content/kbentry/71916/connecting-to-azure-data-lake-from-a-nifi-dataflow.html. Thanks
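For anyone landing here later, a minimal sketch of the core-site.xml entries that approach relies on, assuming the standard hadoop-azure-datalake (adl) client; the id, secret, and tenant values are placeholders from your Azure AD service principal:

    <property>
      <name>fs.adl.oauth2.access.token.provider.type</name>
      <value>ClientCredential</value>
    </property>
    <property>
      <name>fs.adl.oauth2.client.id</name>
      <value>YOUR_CLIENT_ID</value>
    </property>
    <property>
      <name>fs.adl.oauth2.credential</name>
      <value>YOUR_CLIENT_SECRET</value>
    </property>
    <property>
      <name>fs.adl.oauth2.refresh.url</name>
      <value>https://login.microsoftonline.com/YOUR_TENANT_ID/oauth2/token</value>
    </property>

In PutHDFS, point Hadoop Configuration Resources at this file and add the ADLS client jars through the Additional Classpath Resources property.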
07-18-2018
11:11 AM
@Harald Berghoff I tried the steps in the article, but I'm getting the same error. I was hoping this would at least change the error, but it didn't. Any suggestions?
07-18-2018
08:19 AM
@Harald Berghoff I will try this and attempt to push data into HDFS. I'll update. Thanks
07-16-2018
09:53 AM
Hi, I can read data from SQL Server, have converted it into ORC, and am now trying to push it into HDFS. My PutHDFS config is below. I'm just putting data into HDFS, so PutHDFS is terminated at success. Can you please help me get past the error below?
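Roughly my PutHDFS settings (the directory and file paths are placeholders; the property names are the processor's standard ones):

    PutHDFS
      Hadoop Configuration Resources : /etc/hadoop/conf/core-site.xml,/etc/hadoop/conf/hdfs-site.xml
      Directory                      : /data/landing
      Conflict Resolution Strategy   : fail

Success is auto-terminated since this is the end of the flow.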
Labels: