I am running a Sqoop import job to load data into a Hive table from a SQL Server database, but every time the job completes, the data ends up only in the /user/<user-name>/<table-name> HDFS directory.
I also tried setting --target-dir to /tmp so the data would be staged there temporarily before being moved into the Hive table, but with no success: the data is still written to the /user/<user-name>/<table-name> HDFS directory.
sqoop import \
  --connect "jdbc:sqlserver://<server-name>:<port-no>;database=<database-name>" \
  --username <user-name> \
  -P \
  --table <table-name> \
  -- --schema <schema-name> \
  --hive-import \
  --hive-database <hive-database-name> \
  --hive-table <hive-table-name> \
  -m 1
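For comparison, here is the variant I would try next; this is only a sketch based on the Sqoop documentation, not a verified fix, and the /tmp/<table-name>_staging path is just a placeholder. --target-dir sets the temporary HDFS directory Sqoop stages files in before the Hive load, --delete-target-dir removes any leftover staging directory from a previous run, and the connector-specific -- --schema block is moved to the very end, since the docs say extra arguments must follow all of the standard Sqoop arguments.

# hedged sketch: staging dir is a placeholder, connector args moved to the end
sqoop import \
  --connect "jdbc:sqlserver://<server-name>:<port-no>;database=<database-name>" \
  --username <user-name> \
  -P \
  --table <table-name> \
  --target-dir /tmp/<table-name>_staging \
  --delete-target-dir \
  --hive-import \
  --hive-database <hive-database-name> \
  --hive-table <hive-table-name> \
  -m 1 \
  -- --schema <schema-name>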
Is there anything I am missing?
Unfortunately the issue still persists, but I did get some insight from other comments on this post. I am also getting Kafka Atlas hook errors even though the job completes successfully. I am still working on that and will keep the group posted.
FYI: the schema parameter in Sqoop is preceded by a standalone double hyphen (-- --schema), and it takes the schema name in the RDBMS database.
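As a minimal illustration of that syntax (the dbo schema here is just a placeholder): the standalone -- ends the regular Sqoop options, and everything after it is passed to the SQL Server connector, which is why --schema has to follow it.

sqoop import \
  --connect "jdbc:sqlserver://<server-name>:<port-no>;database=<database-name>" \
  --username <user-name> \
  -P \
  --table <table-name> \
  -m 1 \
  -- --schema dbo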