Created 12-12-2018 01:14 PM
I am trying to load data into a Hive table using a sqoop command, but the job hangs partway through and never completes. It gets stuck at "Connecting to jdbc:hive2:///default;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2". I have verified that the data does get uploaded to the HDFS directory, but the Hive table is never created. The Sqoop command I have been using is:

sqoop import --connect "jdbc:sqlserver://;database=;username=;password=" --table --hive-import --create-hive-table --hive-table default. --target-dir /user/hadoop/ --split-by UPDATE_DATE --hive-overwrite -m 1
Could you please help me with a workaround?
Created 12-12-2018 10:21 PM
Just wondering, did you pass a value to the --table parameter? It looks empty in the command you posted.
Created 12-13-2018 12:29 AM
I think you are hitting the same issue as posted here:
https://community.hortonworks.com/questions/214980/sqoop-import-hung-hive-import-hdp-300.html
Basically, the Hive CLI was removed in HDP 3.0, so the --hive-import step now runs through Beeline, and Beeline needs to log in to HiveServer2. The hang you are seeing is Beeline waiting for a username and password. If you don't want to enter credentials interactively, I would suggest using a beeline-hs2-connection.xml file, as described here: https://cwiki.apache.org/confluence/display/Hive/HiveServer2+Clients#HiveServer2Clients-Usinghive-s...
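For example, a minimal beeline-hs2-connection.xml might look like the sketch below. This assumes it is placed in ~/.beeline/ or your Hive conf directory, and the host, port, user, and password are placeholders you would replace with your own:

<?xml version="1.0"?>
<configuration>
  <!-- HiveServer2 endpoint(s); replace with your own host:port -->
  <property>
    <name>beeline.hs2.connection.hosts</name>
    <value>localhost:10000</value>
  </property>
  <!-- Credentials Beeline will use instead of prompting -->
  <property>
    <name>beeline.hs2.connection.user</name>
    <value>hive</value>
  </property>
  <property>
    <name>beeline.hs2.connection.password</name>
    <value>hive</value>
  </property>
</configuration>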
Alternatively, modify your Sqoop command to use HCatalog instead of --hive-import, which avoids the Beeline login entirely.
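A sketch based on your redacted command (the values in angle brackets are placeholders to fill in; note that --target-dir and --hive-overwrite are not supported together with the HCatalog options, so they are dropped here):

sqoop import \
  --connect "jdbc:sqlserver://<host>;database=<db>;username=<user>;password=<pass>" \
  --table <source_table> \
  --hcatalog-database default \
  --hcatalog-table <hive_table> \
  --create-hcatalog-table \
  --split-by UPDATE_DATE \
  -m 1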
Hope this helps