04-18-2017 12:20 PM
Part of my requirement is: a process generates a file in HDFS, and I have to import it into a Hive table and then delete the file generated by the process.
I've written a shell script as follows:
schema_file1=testdb
tbl_file1=mytable
bkp_tbl_file1=mytable_bkp

# Stage-1
hive -S -e "create table $schema_file1.$bkp_tbl_file1 row format delimited fields terminated by ',' stored as textfile as select * from $schema_file1.$tbl_file1;"

# Stage-2
hive -S -e "drop table if exists $schema_file1.$tbl_file1;"

# Stage-3
hive -S -e "import table $schema_file1.$tbl_file1 from '$HDFS_DATA_PATH/$tbl_file1';"
All three stages take the schema name and table name as parameters and execute create, drop & import in Hive. Create & drop work fine, but import table fails with the following error:
Failed with exception Invalid table name default.testdb.mytable FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MoveTask
It seems create & drop use the schema & table name properly, but the import command prefixes the name with "default", which is causing the trouble.
I am referring to this link for the import table command.
Note: Import table works fine if I hardcode the db & table.
Is it an issue with Import command in Hive?
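One workaround I am considering (a sketch only, assuming the error means IMPORT resolves the table name against the current database rather than parsing the db-qualified name): switch to the target database with USE first and pass an unqualified table name to IMPORT.

```shell
# Stage-3 (workaround sketch, not yet verified):
# run "use <db>" in the same hive session so IMPORT resolves the
# unqualified table name against $schema_file1 instead of "default"
hive -S -e "use $schema_file1; import table $tbl_file1 from '$HDFS_DATA_PATH/$tbl_file1';"
```

This keeps the schema and table fully parameterized while avoiding the db-qualified name in the IMPORT statement itself.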