Created 10-09-2018 01:05 PM
I am running a Sqoop import job to load data into a Hive table from a SQL Server database, but every time the job completes, the data ends up only in the /user/<user-name>/<table-name> HDFS directory.
I also tried setting --target-dir to /tmp so the data would be staged there before being moved into the Hive table, but with no success; the data is still placed in the /user/<user-name>/<table-name> HDFS directory.
sqoop import \
  --connect "jdbc:sqlserver://<server-name>:<port-no>;database=<database-name>" \
  --username <user-name> \
  -P \
  --table <table-name> \
  -- --schema <schema-name> \
  --hive-import \
  --hive-database <hive-database-name> \
  --hive-table <hive-table-name> \
  -m 1
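The --target-dir attempt was essentially the same command with one extra argument (shown here only as a sketch; the /tmp path is the one mentioned above and the rest of the options were unchanged):

sqoop import \
  --connect "jdbc:sqlserver://<server-name>:<port-no>;database=<database-name>" \
  --username <user-name> \
  -P \
  --table <table-name> \
  --target-dir /tmp/<table-name> \
  -- --schema <schema-name> \
  --hive-import \
  --hive-database <hive-database-name> \
  --hive-table <hive-table-name> \
  -m 1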
Stack details:
HDP 3.0
Sqoop 1.4
Hive 3.1
Is there anything I am missing?
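(For anyone reproducing this: the leftover directory and the Hive table's actual storage location can be checked with something like the commands below; the hosts, paths, and table names are placeholders.)

# See whether the imported files are still sitting in the staging directory
hdfs dfs -ls /user/<user-name>/<table-name>

# See where the Hive table actually keeps its data (Location field in the output)
beeline -u "jdbc:hive2://<hiveserver2-host>:10000/<hive-database-name>" \
  -e "DESCRIBE FORMATTED <hive-table-name>;"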
Created 10-09-2018 06:00 PM
Created 10-10-2018 05:19 AM
Any updates on this thread? If the above steps helped you resolve the issue, please "Accept" the answer to close the thread so other HCC members can reference it.
Created 10-10-2018 10:56 AM
Unfortunately, the issue still persists, but I got some insight from other comments in this post. I am also getting Kafka-Atlas hook errors, although the job completes successfully. I am still working on that and will keep the group posted.
FYI: the schema parameter in Sqoop takes a double hyphen twice (-- --schema) and expects the schema name of the RDBMS database.
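For example, as far as I understand, the arguments after the standalone -- are handed to the connector rather than to Sqoop itself, so the connector-specific --schema option is normally placed at the very end of the command, after all the generic Sqoop options (a sketch only, not the exact command I ran):

sqoop import \
  --connect "jdbc:sqlserver://<server-name>:<port-no>;database=<database-name>" \
  --username <user-name> \
  -P \
  --table <table-name> \
  --hive-import \
  --hive-database <hive-database-name> \
  --hive-table <hive-table-name> \
  -m 1 \
  -- --schema <schema-name>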