08-29-2018
11:50 AM
The sqoop-create-hive-table command has the same effect. Should sqoop --hive-import work in HDP 3.0, or will I have to go back to HDP 2.6?
$ sqoop-create-hive-table --connect jdbc:mysql://zabbix.amb.corp/zabbix --username *** --password *** --table alerts --hive-table t1 --verbose
...
18/08/29 14:40:07 DEBUG hive.TableDefWriter: Create statement: CREATE TABLE IF NOT EXISTS `history_uint` ( `itemid` BIGINT, `clock` INT, `value` BIGINT, `ns` INT) COMMENT 'Imported by sqoop on 2018/08/29 14:40:07' ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE
18/08/29 14:40:08 DEBUG hive.TableDefWriter: Load statement: LOAD DATA INPATH 'hdfs://hdp01.amb.corp:8020/user/sqoop/history_uint' INTO TABLE `history_uint`
18/08/29 14:40:09 INFO hive.HiveImport: Loading uploaded data into Hive
18/08/29 14:40:11 INFO hive.HiveImport: SLF4J: Class path contains multiple SLF4J bindings.
18/08/29 14:40:11 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/usr/hdp/3.0.0.0-1634/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
18/08/29 14:40:11 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/usr/hdp/3.0.0.0-1634/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
18/08/29 14:40:11 INFO hive.HiveImport: SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
18/08/29 14:40:11 INFO hive.HiveImport: SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
18/08/29 14:40:14 INFO hive.HiveImport: Connecting to jdbc:hive2://hdp03.amb.corp:2181,hdp02.amb.corp:2181,hdp01.amb.corp:2181/default;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2
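A possible workaround sketch (my assumption, reusing the statements Sqoop itself printed above): do a plain HDFS import with no Hive step, then run the generated LOAD DATA through beeline by hand. The cluster commands are commented out; the URL, credentials, and paths all come from the log lines in this thread.

```shell
# HiveServer2 JDBC URL as shown in the hive.HiveImport log line above.
HS2_URL='jdbc:hive2://hdp03.amb.corp:2181,hdp02.amb.corp:2181,hdp01.amb.corp:2181/default;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2'

# Step 1: import to HDFS only, no Hive involvement (run on a cluster node):
# sqoop import --connect jdbc:mysql://zabbix.amb.corp/zabbix \
#   --username '***' --password '***' \
#   --table history_uint --target-dir /user/sqoop/history_uint

# Step 2: the load statement exactly as hive.TableDefWriter emitted it:
LOAD_SQL="LOAD DATA INPATH 'hdfs://hdp01.amb.corp:8020/user/sqoop/history_uint' INTO TABLE \`history_uint\`"
echo "$LOAD_SQL"
# beeline -u "$HS2_URL" -n hive -p '***' -e "$LOAD_SQL"
```

This sidesteps Sqoop's built-in Hive load entirely, so it can help confirm whether the hang is in the HS2 connection step rather than in the import itself.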
08-28-2018
11:42 AM
No problems were found in the HS2 and ZooKeeper logs. But when connecting to Hive with beeline, the server asks for a username and password. Could something similar be happening when Sqoop connects to Hive, and how can it be fixed?

[hive@hdp00]$ hive
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.0.0-1634/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.0.0-1634/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Connecting to jdbc:hive2://hdp03.amb.corp:2181,hdp02.amb.corp:2181,hdp01.amb.corp:2181/default;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2
Enter username for jdbc:hive2://hdp03.amb.corp:2181,hdp02.amb.corp:2181,hdp01.amb.corp:2181/default: hive
Enter password for jdbc:hive2://hdp03.amb.corp:2181,hdp02.amb.corp:2181,hdp01.amb.corp:2181/default: *********
18/08/28 14:35:21 [main]: INFO jdbc.HiveConnection: Connected to hdp02:10000
Connected to: Apache Hive (version 3.1.0.3.0.0.0-1634)
Driver: Hive JDBC (version 3.1.0.3.0.0.0-1634)
Transaction isolation: TRANSACTION_REPEATABLE_READ
Beeline version 3.1.0.3.0.0.0-1634 by Apache Hive
0: jdbc:hive2://hdp03.amb.corp:2181,hdp02.amb>
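For what it's worth, beeline accepts credentials on the command line (-n for the username, -p for the password), which avoids the interactive prompt; a minimal sketch using the discovery URL from the session above, with '***' as a placeholder password:

```shell
# HiveServer2 JDBC URL from the session above (ZooKeeper service discovery
# across the three hdp0x nodes).
HS2_URL='jdbc:hive2://hdp03.amb.corp:2181,hdp02.amb.corp:2181,hdp01.amb.corp:2181/default;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2'
echo "$HS2_URL"

# -n/-p supply the username/password non-interactively, -e runs one statement.
# Commented out because it needs a cluster node.
# beeline -u "$HS2_URL" -n hive -p '***' -e 'SHOW DATABASES;'
```

If this connects without prompting, the question becomes whether Sqoop's Hive load has any way to pass the same credentials.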
08-28-2018
09:46 AM
I'm trying to import a MySQL table into Hive using Sqoop's --hive-import option.
The import stops at the hive.HiveImport: Connecting to jdbc:hive2 step and stays in that state forever.
All service checks in Ambari pass, and the same Hive connection string works in Superset. Import options:

sqoop-import --connect jdbc:mysql://mysql_host/zabbix --username *** --password *** \
--table alerts \
--hive-import \
--create-hive-table \
--hive-table zabbix.alerts \
--verbose
Output:

18/08/28 12:14:42 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7.3.0.0.0-1634
...
18/08/28 12:14:48 INFO mapreduce.ImportJobBase: Beginning import of alerts
...
18/08/28 12:14:58 INFO mapreduce.Job: Running job: job_1535384317331_0003
18/08/28 12:15:06 INFO mapreduce.Job: Job job_1535384317331_0003 running in uber mode : false
18/08/28 12:15:06 INFO mapreduce.Job: map 0% reduce 0%
18/08/28 12:15:14 INFO mapreduce.Job: map 100% reduce 0%
18/08/28 12:15:15 INFO mapreduce.Job: Job job_1535384317331_0003 completed successfully
18/08/28 12:15:15 INFO mapreduce.Job: Counters: 32
File System Counters
FILE: Number of bytes read=0
FILE: Number of bytes written=963956
FILE: Number of read operations=0
FILE: Number of large read operations=0
FILE: Number of write operations=0
HDFS: Number of bytes read=450
HDFS: Number of bytes written=1472709
HDFS: Number of read operations=24
HDFS: Number of large read operations=0
HDFS: Number of write operations=8
Job Counters
Launched map tasks=4
Other local map tasks=4
Total time spent by all maps in occupied slots (ms)=63339
Total time spent by all reduces in occupied slots (ms)=0
Total time spent by all map tasks (ms)=21113
Total vcore-milliseconds taken by all map tasks=21113
Total megabyte-milliseconds taken by all map tasks=32429568
Map-Reduce Framework
Map input records=3152
Map output records=3152
Input split bytes=450
Spilled Records=0
Failed Shuffles=0
Merged Map outputs=0
GC time elapsed (ms)=312
CPU time spent (ms)=10040
Physical memory (bytes) snapshot=1009577984
Virtual memory (bytes) snapshot=13137891328
Total committed heap usage (bytes)=597688320
Peak Map Physical memory (bytes)=258195456
Peak Map Virtual memory (bytes)=3291807744
File Input Format Counters
Bytes Read=0
File Output Format Counters
Bytes Written=1472709
18/08/28 12:15:15 INFO mapreduce.ImportJobBase: Transferred 1.4045 MB in 25.1313 seconds (57.2271 KB/sec)
18/08/28 12:15:15 INFO mapreduce.ImportJobBase: Retrieved 3152 records.
18/08/28 12:15:15 INFO mapreduce.ImportJobBase: Publishing Hive/Hcat import job data to Listeners for table alerts
...
18/08/28 12:15:17 DEBUG hive.TableDefWriter: Create statement: CREATE TABLE `zabbix.alerts` ( `alertid` BIGINT, `actionid` BIGINT, `eventid` BIGINT, `userid` BIGINT, `clock` INT, `mediatypeid` BIGINT, `sendto` STRING, `subject` STRING, `message` STRING, `status` INT, `retries` INT, `error` STRING, `esc_step` INT, `alerttype` INT, `p_eventid` BIGINT, `acknowledgeid` BIGINT) COMMENT 'Imported by sqoop on 2018/08/28 12:15:17' ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE
18/08/28 12:15:17 DEBUG hive.TableDefWriter: Load statement: LOAD DATA INPATH 'hdfs://hdp01.amb.corp:8020/user/hive/alerts' INTO TABLE `zabbix.alerts`
18/08/28 12:15:17 INFO hive.HiveImport: Loading uploaded data into Hive
18/08/28 12:15:20 INFO hive.HiveImport: SLF4J: Class path contains multiple SLF4J bindings.
18/08/28 12:15:20 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/usr/hdp/3.0.0.0-1634/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
18/08/28 12:15:20 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/usr/hdp/3.0.0.0-1634/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
18/08/28 12:15:20 INFO hive.HiveImport: SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
18/08/28 12:15:20 INFO hive.HiveImport: SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
18/08/28 12:15:24 INFO hive.HiveImport: Connecting to jdbc:hive2://hdp03.amb.corp:2181,hdp02.amb.corp:2181,hdp01.amb.corp:2181/default;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2
The import hangs at this last step. What could be the problem?
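One possible direction (an assumption on my part, not confirmed in this thread): in HDP 3.0 the hive command is just a wrapper around beeline, and Sqoop's Hive load goes through HiveServer2, so if HS2 asks for credentials, as the interactive beeline session above does, the non-interactive load step may simply sit waiting at "Connecting to jdbc:hive2". The Sqoop 1.4.7 user guide documents --hs2-url, --hs2-user, and --hs2-keytab for connecting to HiveServer2 explicitly (they are aimed at kerberized clusters, so a plain-password setup may need the workaround above instead). A hedged sketch, with a hypothetical keytab path:

```shell
# Sketch (assumption): point Sqoop at HiveServer2 with explicit connection
# details. --hs2-url/--hs2-user/--hs2-keytab are from the Sqoop 1.4.7 user
# guide; /etc/security/keytabs/hive.service.keytab is a hypothetical path.
HS2_URL='jdbc:hive2://hdp03.amb.corp:2181,hdp02.amb.corp:2181,hdp01.amb.corp:2181/default;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2'
echo "sqoop import ... --hs2-url $HS2_URL"

# sqoop import --connect jdbc:mysql://mysql_host/zabbix \
#   --username '***' --password '***' \
#   --table alerts --hive-import --create-hive-table \
#   --hive-table zabbix.alerts \
#   --hs2-url "$HS2_URL" \
#   --hs2-user hive --hs2-keytab /etc/security/keytabs/hive.service.keytab \
#   --verbose
```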