
Sqoop jobs get stuck while trying to connect to HiveServer2

New Contributor

I am trying to load data into a Hive table using a Sqoop import, but the job hangs partway through and never completes. It gets stuck at "Connecting to jdbc:hive2:///default;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2". I have verified that the data lands in the HDFS directory, but the Hive table is never created. The Sqoop command I have been using is:

  sqoop import --connect "jdbc:sqlserver://;database=;username=;password=" --table --hive-import --create-hive-table --hive-table default. --target-dir /user/hadoop/ --split-by UPDATE_DATE --hive-overwrite -m 1

Could you please help me with a workaround?

2 REPLIES

Mentor

@Navin Agarwala

Just wondering: did you give an argument to the --table parameter? It should look something like this:

  • --table xxxxx
  • ........
  • --hive-overwrite -m 1

Expert Contributor
@Navin Agarwala,

I think you are hitting the same issue as posted here:

https://community.hortonworks.com/questions/214980/sqoop-import-hung-hive-import-hdp-300.html

Basically, the Hive CLI was removed in HDP 3.0, so Sqoop's --hive-import now goes through Beeline, and Beeline needs a login. The hang you are seeing is Beeline waiting for a username and password. If you do not want to supply those interactively, I would suggest using a beeline-hs2-connection.xml as specified here: https://cwiki.apache.org/confluence/display/Hive/HiveServer2+Clients#HiveServer2Clients-Usinghive-s...
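
If you go the beeline-hs2-connection.xml route, here is a minimal sketch; per the Hive wiki, the file can go in ${user.home}/.beeline/ on Unix. The user and password values are assumptions you would replace with your own:

  <?xml version="1.0" encoding="UTF-8"?>
  <configuration>
    <!-- hypothetical credentials; replace with your own -->
    <property>
      <name>beeline.hs2.connection.user</name>
      <value>hiveuser</value>
    </property>
    <property>
      <name>beeline.hs2.connection.password</name>
      <value>hivepassword</value>
    </property>
  </configuration>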

Alternatively, modify your Sqoop syntax to use HCatalog instead of --hive-import.
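
For example, a rough HCatalog equivalent of your import might look like the sketch below, again with placeholder connection and table names. Note that --target-dir and --hive-overwrite are not supported with HCatalog jobs, so they are dropped here:

  # hypothetical placeholders throughout
  sqoop import \
    --connect "jdbc:sqlserver://dbhost:1433;database=mydb" \
    --username myuser --password mypass \
    --table MY_TABLE \
    --hcatalog-database default \
    --hcatalog-table my_table \
    --create-hcatalog-table \
    --split-by UPDATE_DATE \
    -m 1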

Hope this helps.