Member since: 06-08-2017
Posts: 1049
Kudos Received: 518
Solutions: 312
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 11198 | 04-15-2020 05:01 PM |
| | 7097 | 10-15-2019 08:12 PM |
| | 3089 | 10-12-2019 08:29 PM |
| | 11425 | 09-21-2019 10:04 AM |
| | 4319 | 09-19-2019 07:11 AM |
12-12-2017
09:14 AM
@Shu - Thanks a lot, it is really helpful.
12-09-2017
05:13 PM
1 Kudo
@shu Thanks. It worked.
04-06-2018
09:41 AM
I am new to NiFi. I want to sqoop data from MySQL to Hive, and I am using HDP 2.6. I have configured the QueryDatabaseTable processor with DBCP and enabled it, but when I try to start the processor I see the below error. Please help me out.
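For comparison, a typical DBCPConnectionPool configuration for MySQL; every value below is a placeholder, not taken from the post:

```
Database Connection URL     = jdbc:mysql://dbhost:3306/mydb
Database Driver Class Name  = com.mysql.jdbc.Driver
Database Driver Location(s) = /path/to/mysql-connector-java.jar
Database User               = myuser
Password                    = ********
```

A missing or wrong driver jar path, or an unreachable connection URL, is a common reason this processor fails to start.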
12-19-2017
08:09 PM
Thanks, Shu. The problem was fixed after I moved the file to local storage. Thanks for the response.
12-21-2017
02:06 PM
@Shu I also have the same requirement to zip a folder in HDFS directly. I am using the MergeContent processor with Merge Format set to ZIP, but I am not able to get a single zipped file out of the MergeContent processor.
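For reference, these are the stock MergeContent properties that control whether everything lands in a single archive; the values shown are assumptions about this flow, not taken from the post:

```
Merge Strategy            = Bin-Packing Algorithm
Merge Format              = ZIP
Minimum Number of Entries = 1
Maximum Number of Entries = 1000    # raise above the number of files in the folder
Max Bin Age               = 5 min   # forces a merge even when the bin is not full
```

If the folder holds more files than Maximum Number of Entries, MergeContent emits several archives instead of one.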
04-30-2018
06:29 PM
Hi, Sqoop runs fine, but it fails at the end:

```
Logging initialized using configuration in jar:file:/usr/hdp/2.6.4.0-91/hive/lib/hive-common-1.2.1000.2.6.4.0-91.jar!/hive-log4j.properties
OK
Time taken: 2.92 seconds
FAILED: SemanticException Line 2:17 Invalid path ''hdfs://hostname/user/xyz/_sqoop/55cc1038f2924cc398e5e014061eb0f2_sample_table'': No files matching path hdfs://hostname/user/xyz/_sqoop/55cc1038f2924cc398e5e014061eb0f2_sample_table
```

Here xyz is the Unix user running the Sqoop operation. The target dir is different, and I can see data getting loaded into the designated target-dir. Has anyone encountered this? K
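For context, a minimal sketch of the kind of Sqoop Hive import that produces a staging path like the one in the error; the connection string, credentials, and paths are placeholders, not taken from the post:

```bash
# Hypothetical Sqoop Hive import; all connection details are placeholders.
# With --hive-import, Sqoop stages the imported files in HDFS and then runs a
# Hive LOAD DATA against that staging path, which is the step failing above.
sqoop import \
  --connect jdbc:mysql://dbhost:3306/mydb \
  --username myuser -P \
  --table sample_table \
  --hive-import \
  --hive-table default.sample_table \
  --target-dir /user/xyz/sample_table
```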
12-01-2017
07:25 PM
Thank you @Shu; it is running now.
11-29-2017
04:45 PM
3 Kudos
@balalaika Use the SplitJson processor with the following configs.

Input JSON:

```json
{
  "objectIdFieldName": "ID",
  "objectIds": [
    64916,
    67266,
    67237,
    64511
  ]
}
```

Output: a separate flowfile per objectId (64916, 67266, 67237, 64511).

Then use an ExtractText processor to extract the content of each flowfile by adding a new property id with the value (.*). The id attribute is now added to the flowfile, so you can use it in the InvokeHTTP processor.

Flow: GetHTTP (get response JSON) --> SplitJson --> ExtractText --> InvokeHTTP (new query per ID)
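As a quick sanity check of what the split produces, here is a jq one-liner that prints one value per objectId, mirroring the per-ID flowfiles. It assumes SplitJson's JsonPath Expression is set to $.objectIds[*], which the original post does not show:

```bash
# input.json holds the JSON shown above; prints one ID per line.
jq -c '.objectIds[]' input.json
# 64916
# 67266
# 67237
# 64511
```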
11-24-2017
02:22 PM
2 Kudos
@Mohammad Shamim
Run desc formatted on the table name; it shows whether the table is External or Managed, along with the table's location.

```
hive# desc formatted <db-name>.<db-table-name>;
```

Check the size of the database:

```
bash# hdfs dfs -count -h -v <hdfs-location>
```

Example: running desc formatted on a table named devrabbit shows the table type is Managed and the location is /user/hdfs/hive. To find the size of that location, run hdfs dfs -count -h -v /user/hdfs/hive, which displays the size of the directory.
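Putting the two steps together, a small shell sketch that pulls the type and location out of desc formatted and then sizes the path; the database and table names are placeholders:

```bash
# Placeholder names; substitute your own database and table.
DB=default
TABLE=devrabbit

# Show table type (MANAGED_TABLE vs EXTERNAL_TABLE) and the HDFS location.
hive -e "desc formatted ${DB}.${TABLE};" | grep -iE 'table type|location'

# Size the location reported above (example path from the post).
hdfs dfs -count -h -v /user/hdfs/hive
```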
11-23-2017
12:58 PM
@Mohamed Hossam
I think you are missing a space in the Search Value property. Use either of the below regexes in the Search Value property:

```
^(.*?) (.*?) IP (.*?) > (.*?) .*$
```

or

```
([^\s]+)\s([^\s]+)\sIP\s(.*)\s>\s([^\s]+).*
```
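To sanity-check the capture groups outside NiFi, a sed sketch against a made-up tcpdump-style line (the sample input is an assumption; literal spaces replace \s for sed portability):

```bash
# Hypothetical input line; prints the four capture groups pipe-separated.
echo '12:00:01.000000 eth0 IP 10.0.0.1.443 > 10.0.0.2.5555: tcp 0' \
  | sed -E 's/^([^ ]+) ([^ ]+) IP (.*) > ([^ ]+).*$/\1 | \2 | \3 | \4/'
# Output: 12:00:01.000000 | eth0 | 10.0.0.1.443 | 10.0.0.2.5555:
```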