Support Questions

Sqoop stuck. What should I do?

18/11/15 08:47:40 INFO mapreduce.ImportJobBase: Transferred 76 bytes in 32.7027 seconds (2.324 bytes/sec)
18/11/15 08:47:40 INFO mapreduce.ImportJobBase: Retrieved 19 records.
18/11/15 08:47:40 INFO mapreduce.ImportJobBase: Publishing Hive/Hcat import job data to Listeners for table FAFAFA
18/11/15 08:47:40 DEBUG hive.HiveImport: Hive.inputTable: FAFAFA
18/11/15 08:47:40 DEBUG hive.HiveImport: Hive.outputTable: default.fafafa
18/11/15 08:47:40 DEBUG manager.OracleManager: Using column names query: SELECT t.* FROM FAFAFA t WHERE 1=0
18/11/15 08:47:40 DEBUG manager.SqlManager: Execute getColumnInfoRawQuery : SELECT t.* FROM FAFAFA t WHERE 1=0
18/11/15 08:47:40 DEBUG manager.OracleManager$ConnCache: Got cached connection for jdbc:oracle:thin:@192.168.1.93:1521:orcl/sqoop
18/11/15 08:47:40 INFO manager.OracleManager: Time zone has been set to GMT
18/11/15 08:47:40 DEBUG manager.SqlManager: Using fetchSize for next query: 1000
18/11/15 08:47:40 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM FAFAFA t WHERE 1=0
18/11/15 08:47:40 DEBUG manager.SqlManager: Found column NAME of type [12, 500, 0]
18/11/15 08:47:40 DEBUG manager.OracleManager$ConnCache: Caching released connection for jdbc:oracle:thin:@192.168.1.93:1521:orcl/sqoop
18/11/15 08:47:40 DEBUG hive.TableDefWriter: Create statement: CREATE TABLE IF NOT EXISTS `default.fafafa` ( `NAME` STRING) COMMENT 'Imported by sqoop on 2018/11/15 00:47:40' ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE
18/11/15 08:47:40 DEBUG hive.TableDefWriter: Load statement: LOAD DATA INPATH 'hdfs://hunan001:8020/user/hive/FAFAFA' OVERWRITE INTO TABLE `default.fafafa`
18/11/15 08:47:40 INFO hive.HiveImport: Loading uploaded data into Hive
18/11/15 08:47:43 INFO hive.HiveImport: SLF4J: Class path contains multiple SLF4J bindings.
18/11/15 08:47:43 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/usr/hdp/3.0.0.0-1634/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
18/11/15 08:47:43 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/usr/hdp/3.0.0.0-1634/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
18/11/15 08:47:43 INFO hive.HiveImport: SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
18/11/15 08:47:43 INFO hive.HiveImport: SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
18/11/15 08:47:46 INFO hive.HiveImport: Connecting to jdbc:hive2://hunan001:2181/default;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2

1 Reply

Expert Contributor

@Jack

You would want to create a file called beeline-hs2-connection.xml under /etc/hive/conf on the node where you are running the sqoop command, with the contents below. When the Sqoop job reaches the Hive import step via HiveServer2, it will use these credentials to load the data.

One thing to ensure here is that the hive user has read/write/execute access to the temporary HDFS location from which it picks up the data and moves it into the Hive table location; in your log that staging path is hdfs://hunan001:8020/user/hive/FAFAFA.
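A minimal sketch of how you might check and grant that access, assuming the staging path from your log and that hive:hadoop is the correct owner/group on your cluster (adjust both to your environment):

# Inspect the staging directory that the LOAD DATA statement points at
hdfs dfs -ls /user/hive/FAFAFA

# Hand it over to the hive user (owner/group here are assumptions)
hdfs dfs -chown -R hive:hadoop /user/hive/FAFAFA
hdfs dfs -chmod -R 770 /user/hive/FAFAFA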

<?xml version="1.0"?>
<configuration>
  <property>
    <name>beeline.hs2.connection.user</name>
    <value>hive</value>
  </property>
  <property>
    <name>beeline.hs2.connection.password</name>
    <value>hive</value>
  </property>
</configuration>
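Once the file is in place, one way to verify it is being picked up (an optional sanity check, assuming hive-site.xml in /etc/hive/conf carries your HiveServer2 ZooKeeper discovery settings) is to start Beeline with no arguments on the same node; it reads beeline-hs2-connection.xml and should connect without prompting for credentials:

# Run on the node where the sqoop command is launched; landing at a
# connected jdbc:hive2://... prompt means the credentials file is in use
beeline

If that connects cleanly, rerun the sqoop import and the Hive load should proceed past the "Connecting to jdbc:hive2://..." line where it currently hangs.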
