I have imported SQL Server data through Sqoop into HDFS, using --target-dir /user/hive/warehouse/db_abc.db/t_tableName . I imported into HDFS (rather than directly into Hive) because I needed to use the --query switch. Now I want to load the data from HDFS into Hive. I have tried:
LOAD DATA INPATH '/user/hive/warehouse/db_abc.db/t_tableName' INTO TABLE db_abc.t_staging_table
The above command throws an "Invalid path: no files matching path" error.
How can I load data from HDFS into Hive, or from SQL Server into Hive directly through Sqoop?
I have not found any way to import directly into Hive when the --query switch is used in Sqoop.
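For reference, the kind of free-form-query import described above generally looks something like this (the connection string, query, and split column here are placeholders, not the values actually used in the question):

```
sqoop import \
  --connect "jdbc:sqlserver://<host>:1433;databaseName=db_abc" \
  --username <user> -P \
  --query "SELECT ... FROM ... WHERE \$CONDITIONS" \
  --split-by <key_column> \
  --target-dir /user/hive/warehouse/db_abc.db/t_tableName
```

With --query, Sqoop requires an explicit --target-dir and a $CONDITIONS token in the WHERE clause so it can split the query across mappers.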
You might have a permissions issue reading that directory, or the files may not be where you expect. You can specify a different target directory with Sqoop's --target-dir, or use --hive-import to import directly into Hive; both are standard Sqoop options. First, check what is actually in HDFS:
hdfs dfs -ls /user/hive/warehouse/db_abc.db/t_tableName
hdfs dfs -ls /user/hive/warehouse/db_abc.db
Check the output to see whether there are any files there to load.
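If the files are present and readable, the --hive-import route mentioned above can be sketched roughly like this (connection details and the split column are placeholders; note that when --query is combined with --hive-import, Sqoop also needs an explicit --hive-table and a temporary --target-dir):

```
sqoop import \
  --connect "jdbc:sqlserver://<host>:1433;databaseName=db_abc" \
  --username <user> -P \
  --query "SELECT ... FROM ... WHERE \$CONDITIONS" \
  --split-by <key_column> \
  --hive-import \
  --hive-table db_abc.t_staging_table \
  --target-dir /tmp/t_tableName_import
```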
Hi @Nirvana India, you can also create an external Hive table located at your Sqoop --target-dir. That way, as soon as the Sqoop job finishes, your Hive table is ready without any additional LOAD commands. See my answer here for an example. If you also need an internal (managed) Hive table, say in ORC format, you can use a so-called CREATE TABLE AS SELECT:
CREATE TABLE t1_orc STORED AS ORC AS SELECT * FROM <your-external-table>;
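A minimal sketch of the external-table approach, assuming Sqoop's default text output (comma-delimited fields) and made-up column names for illustration:

```
CREATE EXTERNAL TABLE db_abc.t_tableName_ext (
  id   INT,
  name STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/user/hive/warehouse/db_abc.db/t_tableName';
```

Because the table is EXTERNAL and points at the existing --target-dir, Hive reads the Sqoop output files in place; dropping the table later removes only the metadata, not the files.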
Use the export command:

sqoop export \
  --connect jdbc:mysql://localhost/db \
  --username root \
  --table employee \
  --export-dir /emp/emp_data

(Note that sqoop export moves data from HDFS back into a relational database, the opposite direction from loading into Hive.)