Created 08-09-2017 01:54 PM
Hi,
I want to understand the steps below.
1. I have placed a log file into HDFS.
2. Then I processed the log file in HDFS into CSV format; say the location of the CSV is "/my/loc/" (both steps 1 and 2 are done with NiFi).
3. Created an external table in Hive: create external table load_table (col1, col2, ...) row format delimited fields terminated by ',' lines terminated by '\n' stored as textfile location '/my/loc';
-----I want to do as below-----
4. Copy the above data from load_table into orig_table (an internal/managed table).
5. Remove the CSV files (since the data streams into CSV continuously, the directory may grow without bound).
(A sketch of steps 4 and 5 follows this list.)
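A minimal HiveQL sketch of what steps 4 and 5 are aiming at, assuming orig_table mirrors load_table's columns; the string types and ORC storage below are my assumptions, not taken from the actual log format:

-- Managed (internal) table that keeps the permanent copy; ORC is an assumed storage choice
create table if not exists orig_table (col1 string, col2 string)
stored as orc;

-- Step 4: copy whatever the external table currently sees under /my/loc
insert overwrite table orig_table
select col1, col2 from load_table;

Note that insert overwrite replaces whatever is already in orig_table on every run; if the goal is to append each new batch of streamed rows, insert into table orig_table ... may be the better fit.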
Things that need clarification:
I have used the PutHiveQL processor for step 4 with "insert overwrite table orig_table select col1, col2, ... from load_table;", but I got the error "extraneous input ';' expecting EOF near <EOF>". I thought it was because of the ';' in the insert statement and removed it, but then I got an "org.apache.hadoop.hive.ql.exec.MoveTask" error.
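My understanding of the first error is that PutHiveQL passes each statement to HiveServer2 as-is, and HiveServer2 does not accept a trailing ';', so the statement should probably be entered without it (column list abbreviated exactly as above):

insert overwrite table orig_table
select col1, col2, ... from load_table

The MoveTask error that appeared afterwards seems to be a separate problem; from what I can tell it usually shows up when Hive cannot move files from its staging/scratch directory into the table's location (for example an HDFS permission or ownership issue), but I am not certain that is the cause here.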
Also, for step 5, how can I remove the CSV files? The streaming data keeps arriving and being stored as CSV, so this may grow huge.
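One approach I am considering (an assumption on my side, not something confirmed in this thread): since load_table is external, dropping the table would not delete the files anyway, so the already-loaded CSV files under /my/loc would have to be removed from HDFS directly after each successful copy, for example from the Hive CLI (an equivalent hdfs dfs -rm from a script, or NiFi's DeleteHDFS processor, should do the same job):

-- Step 5 (sketch): clear out the external table's location once the data has been copied
dfs -rm -r /my/loc/*;

The only worry is rows that stream in while the copy is running, so deleting only the files that existed at the time of the insert would be safer than wiping the whole directory.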
Thank you
Created 08-10-2017 07:01 AM
Could you share the complete log? It will be easier to sort it out if the complete log file is available.
Created 11-23-2017 04:16 PM
@Bala Vignesh N V I have sorted it, thanks for your help