Member since: 04-09-2018
Posts: 35
Kudos Received: 0
Solutions: 0
06-01-2020
04:24 AM
Hi Alim, thanks for this wonderful post on using NiFi for CDC. Is there a provision in NiFi to do something similar for an Oracle database?
02-26-2020
04:51 AM
Hi, go to your Kafka log directory (cd /kafka-logs), open meta.properties in an editor (vi meta.properties), change broker.id=1001 to broker.id=1, then restart Kafka.
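The fix above can be sketched as a small script. This is a demonstration on a throwaway copy of meta.properties, since on a real broker the file lives under your log.dirs path (e.g. /kafka-logs) and Kafka should be stopped before editing it:

```python
# Demonstrate the meta.properties fix on a throwaway copy; on a real broker the
# file is <log.dirs>/meta.properties, e.g. /kafka-logs/meta.properties, and Kafka
# should be stopped before the edit and restarted afterwards.
import tempfile
from pathlib import Path

def fix_broker_id(meta_path, old="broker.id=1001", new="broker.id=1"):
    """Replace the stored broker id line in a Kafka meta.properties file."""
    path = Path(meta_path)
    lines = path.read_text().splitlines()
    lines = [new if line.strip() == old else line for line in lines]
    path.write_text("\n".join(lines) + "\n")

# Throwaway copy standing in for /kafka-logs/meta.properties.
demo = Path(tempfile.mkdtemp()) / "meta.properties"
demo.write_text("version=0\nbroker.id=1001\n")
fix_broker_id(demo)
print(demo.read_text())  # broker.id line now reads broker.id=1
```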
02-01-2020
05:40 AM
I have uploaded a NiFi example template at https://github.com/maxbback/nifi-xml
01-30-2020
03:24 AM
How do I check if HS2 can reach port 2181?
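A quick way to answer this kind of question is a plain TCP probe from the HiveServer2 host. A minimal sketch in Python, where "zk_host" stands in for your actual ZooKeeper hostname and 2181 is ZooKeeper's default client port:

```python
# Probe whether a TCP port is reachable from this host.
import socket

def port_reachable(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# On the cluster, run this on the HS2 host against your ZooKeeper node:
# port_reachable("zk_host", 2181)
print(port_reachable("127.0.0.1", 2181))
```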
12-16-2019
06:32 PM
Hi Alim, I just started to reproduce this work and ran into the following error. My testing environment is: Ubuntu 16.04, with MySQL and NiFi installed on the system; I can access MySQL via the terminal from a remote Windows PC. Does anyone know how to address the issue? Thanks. Kind regards, Raymond
08-07-2019
12:58 AM
@Matt Burgess I am also thinking of converting deeply nested XML to CSV, using one of ConvertRecord, UpdateRecord, or JoltTransformRecord. What is the difference between UpdateRecord and JoltTransformRecord? Which one would be suitable?
03-25-2019
12:47 PM
Hi @Deb Yes, I wrote an article on doing CDC with NiFi for a MySQL database: https://community.hortonworks.com/articles/113941/change-data-capture-cdc-with-apache-nifi-version-1-1.html NiFi doesn't currently support MS SQL, but it appears a NiFi contributor has it in the works: https://github.com/apache/nifi/pull/2231 I am not sure whether that solution uses transaction logs or not.
11-28-2018
04:43 AM
You can refer to this article for HWC configuration details: https://community.hortonworks.com/content/kbentry/223626/integrating-apache-hive-with-apache-spark-hive-war.html
11-27-2018
09:35 AM
@Deb This doc will help you: https://docs.hortonworks.com/HDPDocuments/HDP3/HDP-3.0.1/integrating-hive/content/hive_hivewarehousesession_api_operations.html
03-21-2019
06:41 PM
I have the same issue. Any ideas? It says that certain parameters are not set, but they are set in Ambari. I have restarted Ambari and the Hive node, but with no results.
08-14-2018
01:32 PM
It's working now. The format should be "(SELECT * FROM T_DISTRICT_TYPE_test) as abc". Without the alias it doesn't work.
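Assuming this is the Spark JDBC reader (which is what requires an alias on a subquery passed as dbtable), the working form would look like the sketch below; the JDBC URL, credentials, and paths are placeholders:

```python
# Hypothetical Spark JDBC read; the subquery passed as "dbtable" must carry an
# alias ("as abc"), as noted above. URL and credentials are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("jdbc-subquery-alias").getOrCreate()

df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:mysql://db_host:3306/mydb")
    .option("dbtable", "(SELECT * FROM T_DISTRICT_TYPE_test) as abc")
    .option("user", "myuser")
    .option("password", "mypassword")
    .load()
)
```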
08-13-2018
06:06 PM
1 Kudo
@Deb, I think this is expected. Incremental import mode can be used to retrieve only rows newer than some previously imported set of rows. There is no direct way to achieve the use case you are looking for. That said, you can refer to this document: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.1/bk_data-access/content/incrementally-updating-hive-table-with-sqoop-and-ext-table.html Hope this helps.
07-18-2018
01:17 PM
@Deb This looks to be related to the Parquet encoding being different in Spark than in Hive. Have you tried reading a different, non-Parquet table? Try adding the following configuration for the Parquet table: .config("spark.sql.parquet.writeLegacyFormat","true") If that does not work, please open a new thread on this issue and we can follow up there. Thanks!
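In context, the suggested setting goes on the SparkSession builder. A minimal sketch (the app name is arbitrary, and enableHiveSupport is only needed when reading Hive tables):

```python
# spark.sql.parquet.writeLegacyFormat=true makes Spark write Parquet in the
# legacy, Hive-compatible format, per the suggestion above.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("parquet-legacy-format")
    .config("spark.sql.parquet.writeLegacyFormat", "true")
    .enableHiveSupport()
    .getOrCreate()
)
```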
04-22-2019
11:58 AM
Could you please share the NiFi template for this? I am not able to achieve this.