Member since: 06-08-2017
Posts: 1049
Kudos Received: 517
Solutions: 312

My Accepted Solutions
Title | Views | Posted
---|---|---
 | 9882 | 04-15-2020 05:01 PM
 | 5928 | 10-15-2019 08:12 PM
 | 2410 | 10-12-2019 08:29 PM
 | 9556 | 09-21-2019 10:04 AM
 | 3501 | 09-19-2019 07:11 AM
07-26-2019
01:30 AM
@Yogesh Kumar This is expected behaviour from the ConvertJSONToSQL processor; the column values are placed in flowfile attributes, so check the attributes if you want to view them. Refer to this similar question for more details: https://community.hortonworks.com/questions/155492/can-someone-tell-me-how-to-update-a-record-in-orac.html Using ConvertJSONToSQL is the older approach; instead, use the PutDatabaseRecord processor and define a RecordReader controller service to read the incoming flowfile.
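A minimal Python sketch of the behaviour described above, using a made-up JSON record and table name: the generated SQL carries only ? placeholders, while the values travel separately as attributes (ConvertJSONToSQL writes them as sql.args.N.value), which is why they are not visible in the flowfile content.

```python
import json

# Hypothetical JSON record, like a flowfile entering ConvertJSONToSQL.
record = json.loads('{"id": 7, "name": "widget", "price": 9.99}')

table = "products"  # assumed target table name
columns = list(record.keys())

# The generated SQL uses ? placeholders; the values are not in the content.
sql = f"INSERT INTO {table} ({', '.join(columns)}) VALUES ({', '.join('?' for _ in columns)})"

# The values travel out-of-band as flowfile attributes (sql.args.N.value)
# rather than inside the flowfile content.
attributes = {f"sql.args.{i}.value": str(record[c]) for i, c in enumerate(columns, start=1)}

print(sql)
print(attributes)
```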
07-25-2019
03:32 AM
@Shailuk Could you set the password in the ListSFTP processor and then try running the processor again?
07-25-2019
03:28 AM
@Thuy Le If the answer helped resolve the issue, please log in and click the Accept button below to close this thread. This will help other community users find answers quickly 🙂
07-23-2019
03:42 AM
1 Kudo
@Shailuk Schedule the GetSFTP processor to run on the primary node with a Run Schedule of 0 sec; the processor will then run as often as possible and pull files from the configured directory. **NOTE:** if the file is not deleted from the path, GetSFTP will pull the same file again and again, because GetSFTP does not store state. Correct approach: use the ListSFTP + FetchSFTP processors. Configure ListSFTP to run on the primary node with a Run Schedule of 0 sec; this processor stores state and runs incrementally, listing only the files newly added to the directory. FetchSFTP then fetches the files from the directory, and a PutFile processor stores them on the local machine. A sketch of the stateless vs. stateful difference follows below.
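A minimal sketch of that difference, using a hypothetical local directory in place of an SFTP server: the stateless version returns every file on every run (GetSFTP-like), while the stateful version remembers the newest modification time it has seen and lists only newer files, which is essentially what ListSFTP's stored state gives you.

```python
import os

def list_all(directory):
    """Stateless listing (GetSFTP-like): returns every file on every run."""
    return [os.path.join(directory, name) for name in os.listdir(directory)]

class IncrementalLister:
    """Stateful listing (ListSFTP-like): only files newer than the last run."""

    def __init__(self):
        self.last_seen_mtime = 0.0  # the "state" that persists between runs

    def list_new(self, directory):
        new_files = []
        for name in os.listdir(directory):
            path = os.path.join(directory, name)
            if os.path.getmtime(path) > self.last_seen_mtime:
                new_files.append(path)
        if new_files:
            self.last_seen_mtime = max(os.path.getmtime(p) for p in new_files)
        return new_files

# Hypothetical usage: the second call returns only files added after the first call.
lister = IncrementalLister()
print(lister.list_new("/tmp"))   # first run: everything currently in the directory
print(lister.list_new("/tmp"))   # second run: only newly added/modified files
```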
07-23-2019
01:30 AM
@Thuy Le If you have an array, configure an EvaluateJsonPath processor and add a new property such as $.[2].length() // i.e. if there are 2 JSON objects in the array, the attribute value will be 1; otherwise it will be empty. Then use a RouteOnAttribute processor to check whether the value is empty or 1 and route the flowfile accordingly. 2. What if the JSON data is empty? The processor still adds the length attribute, with its value as an empty string set, and in RouteOnAttribute you can check whether the value is empty or not. Another way to check is a RouteText processor: add a regular-expression dynamic property to test whether there is any data in the flowfile content.
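A minimal Python sketch of the checks these processors perform, on hypothetical flowfile content: detect empty content, compute the array length, and route on the result. The routing names here are made up for illustration.

```python
import json

def route(flowfile_content: str) -> str:
    """Mimics the EvaluateJsonPath + RouteOnAttribute decision on sample content."""
    if not flowfile_content.strip():
        # Empty content: the length attribute would be an empty string,
        # which RouteOnAttribute (or RouteText) can catch.
        return "empty"
    data = json.loads(flowfile_content)
    array_length = len(data) if isinstance(data, list) else 0
    return "split" if array_length >= 2 else "other"

print(route('[{"a": 1}, {"a": 2}, {"a": 3}]'))  # -> split
print(route('[{"a": 1}]'))                       # -> other
print(route(''))                                 # -> empty
```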
07-22-2019
04:17 AM
1 Kudo
@Thuy Le Use an EvaluateJsonPath processor and add a new property to determine the length of the array. Then use a RouteOnAttribute processor to check the array_length attribute value and route to a SplitJson processor or some other processor. Flow:

-- other processors
   |
   EvaluateJsonPath   // add the array_length property to the processor
   |
   RouteOnAttribute   // add dynamic properties to check the array_length attribute value
   |
   |-- (array_length more than 2) --> SplitJson
   |-- (array_length less than 2) --> some other processors..
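A rough Python sketch of the SplitJson branch of that flow, on made-up content: once the array is known to be long enough, each element becomes its own piece of output, roughly how SplitJson turns one flowfile into many.

```python
import json

def split_array(flowfile_content: str):
    """Roughly what the SplitJson branch does: one output per array element."""
    elements = json.loads(flowfile_content)
    return [json.dumps(e) for e in elements]

content = '[{"id": 1}, {"id": 2}, {"id": 3}]'
array_length = len(json.loads(content))

if array_length > 2:                       # the RouteOnAttribute check
    for piece in split_array(content):     # the SplitJson branch
        print(piece)
else:
    print("route to some other processor")
```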
07-18-2019
04:15 PM
@Duraisankar S If the answer helped resolve the issue, please log in and click the Accept button below to close this thread. This will help other community users find answers quickly 🙂
07-18-2019
03:25 AM
@Duraisankar S You can run a major compaction on the partition in Hive; after the major compaction completes, a base-**** directory is created. Spark is then able to read the specific partitions that have base-*** directories in them, but it is not able to read delta directories, as there is an open Jira [SPARK-15348] about Spark not being able to read ACID tables. I think that starting from HDP 3.x, the HiveWarehouseConnector supports reading Hive ACID tables.
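A hedged sketch of the two steps described above. The table and partition names are invented, and the HiveWarehouseConnector call follows the pyspark_llap API shipped with HDP 3.x, so treat it as an illustration rather than a guaranteed snippet.

```python
# 1) In Hive (e.g. via beeline), trigger a major compaction on the partition
#    so a base-* directory gets written (hypothetical table/partition names):
#
#    ALTER TABLE sales PARTITION (dt='2019-07-18') COMPACT 'major';

# 2) On HDP 3.x, read the ACID table through the HiveWarehouseConnector
#    instead of reading the warehouse files directly:
from pyspark.sql import SparkSession
from pyspark_llap import HiveWarehouseSession

spark = SparkSession.builder.enableHiveSupport().getOrCreate()
hive = HiveWarehouseSession.session(spark).build()

df = hive.executeQuery("SELECT * FROM sales WHERE dt = '2019-07-18'")
df.show()
```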
07-12-2019
02:07 PM
@Sampath Kumar Since you have enabled Ranger authorization, DFS commands are restricted in Hive; they are blocked whenever authorization is enabled.
07-09-2019
03:35 AM
@Shawn Park You can use an AttributesToJSON processor, then pass the flowfile to the downstream processor and read it with an ExecuteStreamCommand processor. (or) Use an UpdateAttribute processor with NiFi Expression Language to add attributes to the flowfile, then pass the flowfile to the downstream processor and use the flowfile's attributes there.
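A minimal sketch of the first option, assuming AttributesToJSON is configured to write the attributes into the flowfile content: ExecuteStreamCommand could invoke a script like the one below, which reads that JSON from stdin. The attribute names are hypothetical.

```python
#!/usr/bin/env python
import json
import sys

# ExecuteStreamCommand pipes the flowfile content to the command's stdin.
# With AttributesToJSON writing to the content, that is a JSON map of attributes.
attributes = json.load(sys.stdin)

# Hypothetical attribute names, purely for illustration.
filename = attributes.get("filename", "")
source = attributes.get("source.system", "unknown")

# Whatever is written to stdout becomes the outgoing flowfile content.
print(f"processed {filename} from {source}")
```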