
Sqoop import to Hive failing

[root@sandbox-hdp hive_javafiles]# sqoop import-all-tables -m 1 --connect "jdbc:mysql://sandbox-hdp.hortonworks.com:3306/retail_db" --username=rajan --password=hadooprj --hive-import --hive-home /apps/hive/warehouse --hive-overwrite --hive-database retaildb --create-hive-table --outdir /root/sqoop_data/hive_javafiles
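Two warnings in the log below can be addressed in the command itself: Sqoop flags the plain-text --password as insecure (its own log suggests -P, which prompts interactively), and the MySQL driver repeatedly warns about establishing an SSL connection without server identity verification. A variant of the same command with both addressed (a sketch, assuming the sandbox MySQL server does not require SSL):

```shell
# Sketch of the command above with two log warnings addressed:
# -P prompts for the password instead of exposing it on the command line,
# and useSSL=false in the JDBC URL silences the MySQL SSL warning
# (only appropriate when the server does not actually need SSL).
sqoop import-all-tables -m 1 \
  --connect "jdbc:mysql://sandbox-hdp.hortonworks.com:3306/retail_db?useSSL=false" \
  --username rajan -P \
  --hive-import --hive-overwrite \
  --hive-database retaildb --create-hive-table \
  --outdir /root/sqoop_data/hive_javafiles
```

Note that --hive-home is dropped here: it overrides $HIVE_HOME (the Hive installation directory), so pointing it at a warehouse path, as in the original command, is most likely not what was intended.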



Warning: /usr/hdp/3.0.1.0-187/accumulo does not exist! Accumulo imports will fail. Please set $ACCUMULO_HOME to the root of your Accumulo installation.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
19/07/22 08:48:02 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7.3.0.1.0-187
19/07/22 08:48:02 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
19/07/22 08:48:02 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
19/07/22 08:48:02 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
19/07/22 08:48:04 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
Mon Jul 22 08:48:10 UTC 2019 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
19/07/22 08:48:16 INFO tool.CodeGenTool: Beginning code generation
19/07/22 08:48:16 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1
19/07/22 08:48:17 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1
19/07/22 08:48:17 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/3.0.1.0-187/hadoop-mapreduce
19/07/22 08:48:55 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/b5dcd948603edc84e038d04c4aa130e8/categories.jar
19/07/22 08:48:56 WARN manager.MySQLManager: It looks like you are importing from mysql.
19/07/22 08:48:56 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
19/07/22 08:48:56 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
19/07/22 08:48:56 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
19/07/22 08:48:56 INFO mapreduce.ImportJobBase: Beginning import of categories
19/07/22 08:49:06 INFO client.RMProxy: Connecting to ResourceManager at sandbox-hdp.hortonworks.com/172.18.0.2:8050
19/07/22 08:49:13 INFO client.AHSProxy: Connecting to Application History server at sandbox-hdp.hortonworks.com/172.18.0.2:10200
19/07/22 08:49:15 INFO mapreduce.JobResourceUploader: Disabling Erasure Coding for path: /user/root/.staging/job_1563776875770_0012
Mon Jul 22 08:49:38 UTC 2019 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
19/07/22 08:49:39 INFO db.DBInputFormat: Using read commited transaction isolation
19/07/22 08:49:39 INFO mapreduce.JobSubmitter: number of splits:1
19/07/22 08:49:42 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1563776875770_0012
19/07/22 08:49:42 INFO mapreduce.JobSubmitter: Executing with tokens: []
19/07/22 08:49:45 INFO conf.Configuration: found resource resource-types.xml at file:/etc/hadoop/3.0.1.0-187/0/resource-types.xml
19/07/22 08:49:46 INFO impl.YarnClientImpl: Submitted application application_1563776875770_0012
19/07/22 08:49:47 INFO mapreduce.Job: The url to track the job: http://sandbox-hdp.hortonworks.com:8088/proxy/application_1563776875770_0012/
19/07/22 08:49:47 INFO mapreduce.Job: Running job: job_1563776875770_0012
19/07/22 08:50:51 INFO mapreduce.Job: map 0% reduce 0%
19/07/22 08:51:58 INFO mapreduce.Job: map 100% reduce 0%
19/07/22 08:52:03 INFO mapreduce.Job: Job job_1563776875770_0012 completed successfully
19/07/22 08:52:05 INFO mapreduce.Job: Counters: 32
	File System Counters
		FILE: Number of bytes read=0
		FILE: Number of bytes written=244760
		FILE: Number of read operations=0
		FILE: Number of large read operations=0
		FILE: Number of write operations=0
		HDFS: Number of bytes read=85
		HDFS: Number of bytes written=1029
		HDFS: Number of read operations=6
		HDFS: Number of large read operations=0
		HDFS: Number of write operations=2
	Job Counters
		Launched map tasks=1
		Other local map tasks=1
		Total time spent by all maps in occupied slots (ms)=231736
		Total time spent by all reduces in occupied slots (ms)=0
		Total time spent by all map tasks (ms)=57934
		Total vcore-milliseconds taken by all map tasks=57934
		Total megabyte-milliseconds taken by all map tasks=59324416
	Map-Reduce Framework
		Map input records=58
		Map output records=58
		Input split bytes=85
		Spilled Records=0
		Failed Shuffles=0
		Merged Map outputs=0
		GC time elapsed (ms)=895
		CPU time spent (ms)=15790
		Physical memory (bytes) snapshot=212258816
		Virtual memory (bytes) snapshot=2855272448
		Total committed heap usage (bytes)=142082048
		Peak Map Physical memory (bytes)=212258816
		Peak Map Virtual memory (bytes)=2855272448
	File Input Format Counters
		Bytes Read=0
	File Output Format Counters
		Bytes Written=1029
19/07/22 08:52:05 INFO mapreduce.ImportJobBase: Transferred 1.0049 KB in 180.4778 seconds (5.7015 bytes/sec)
19/07/22 08:52:05 INFO mapreduce.ImportJobBase: Retrieved 58 records.
Mon Jul 22 08:52:05 UTC 2019 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.

[root@sandbox-hdp hive_javafiles]# sqoop import-all-tables -m 8 --connect "jdbc:mysql://sandbox-hdp.hortonworks.com:3306/retail_db" --username=rajan --password=hadooprj --hive-import --hive-home /user/hive_import --hive-overwrite --hive-database retaildb --create-hive-table --outdir /root/sqoop_data/hive_javafiles

Warning: /usr/hdp/3.0.1.0-187/accumulo does not exist! Accumulo imports will fail. Please set $ACCUMULO_HOME to the root of your Accumulo installation.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
19/07/22 09:03:37 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7.3.0.1.0-187
19/07/22 09:03:37 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
19/07/22 09:03:37 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
19/07/22 09:03:37 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
19/07/22 09:03:39 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
Mon Jul 22 09:03:44 UTC 2019 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
19/07/22 09:03:49 INFO tool.CodeGenTool: Beginning code generation
19/07/22 09:03:49 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1
19/07/22 09:03:49 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1
19/07/22 09:03:49 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/3.0.1.0-187/hadoop-mapreduce
19/07/22 09:04:14 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/f499f4fecd56d5ce25e2d50696f3d3e6/categories.jar
19/07/22 09:04:14 WARN manager.MySQLManager: It looks like you are importing from mysql.
19/07/22 09:04:14 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
19/07/22 09:04:14 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
19/07/22 09:04:14 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
19/07/22 09:04:15 INFO mapreduce.ImportJobBase: Beginning import of categories
19/07/22 09:04:23 INFO client.RMProxy: Connecting to ResourceManager at sandbox-hdp.hortonworks.com/172.18.0.2:8050
19/07/22 09:04:28 INFO client.AHSProxy: Connecting to Application History server at sandbox-hdp.hortonworks.com/172.18.0.2:10200
19/07/22 09:04:30 INFO mapreduce.JobResourceUploader: Disabling Erasure Coding for path: /user/root/.staging/job_1563776875770_0013
Mon Jul 22 09:04:44 UTC 2019 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
19/07/22 09:04:45 INFO db.DBInputFormat: Using read commited transaction isolation
19/07/22 09:04:45 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(`category_id`), MAX(`category_id`) FROM `categories`
19/07/22 09:04:45 INFO db.IntegerSplitter: Split size: 7; Num splits: 8 from: 1 to: 58
19/07/22 09:04:45 INFO mapreduce.JobSubmitter: number of splits:8
19/07/22 09:04:48 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1563776875770_0013
19/07/22 09:04:48 INFO mapreduce.JobSubmitter: Executing with tokens: []
19/07/22 09:04:49 INFO conf.Configuration: found resource resource-types.xml at file:/etc/hadoop/3.0.1.0-187/0/resource-types.xml
19/07/22 09:04:50 INFO impl.YarnClientImpl: Submitted application application_1563776875770_0013
19/07/22 09:04:50 INFO mapreduce.Job: The url to track the job: http://sandbox-hdp.hortonworks.com:8088/proxy/application_1563776875770_0013/
19/07/22 09:04:50 INFO mapreduce.Job: Running job: job_1563776875770_0013
19/07/22 09:05:49 INFO mapreduce.Job: Job job_1563776875770_0013 running in uber mode : false
19/07/22 09:05:49 INFO mapreduce.Job: map 0% reduce 0%
19/07/22 09:08:13 INFO mapreduce.Job: Task Id : attempt_1563776875770_0013_m_000002_0, Status : FAILED
Error: java.lang.RuntimeException: java.lang.RuntimeException: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure
The last packet successfully received from the server was 26,100 milliseconds ago. The last packet sent successfully to the server was 25,753 milliseconds ago.
	at org.apache.sqoop.mapreduce.db.DBInputFormat.setDbConf(DBInputFormat.java:167)
	at org.apache.sqoop.mapreduce.db.DBInputFormat.setConf(DBInputFormat.java:158)
	at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:77)
	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:137)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:763)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:347)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:174)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:168)
Caused by: java.lang.RuntimeException: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure
The last packet successfully received from the server was 26,100 milliseconds ago. The last packet sent successfully to the server was 25,753 milliseconds ago.
	at org.apache.sqoop.mapreduce.db.DBInputFormat.getConnection(DBInputFormat.java:220)
	at org.apache.sqoop.mapreduce.db.DBInputFormat.setDbConf(DBInputFormat.java:165)
	... 10 more
Caused by: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure
The last packet successfully received from the server was 26,100 milliseconds ago. The last packet sent successfully to the server was 25,753 milliseconds ago.
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at com.mysql.jdbc.Util.handleNewInstance(Util.java:425)
	at com.mysql.jdbc.SQLError.createCommunicationsException(SQLError.java:990)
	at com.mysql.jdbc.ExportControlled.transformSocketToSSLSocket(ExportControlled.java:203)
	at com.mysql.jdbc.MysqlIO.negotiateSSLConnection(MysqlIO.java:4901)
	at com.mysql.jdbc.MysqlIO.proceedHandshakeWithPluggableAuthentication(MysqlIO.java:1659)
	at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:1226)
	at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2188)
	at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2219)
	at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2014)
	at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:776)
	at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:47)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at com.mysql.jdbc.Util.handleNewInstance(Util.java:425)
	at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:386)
	at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:330)
	at java.sql.DriverManager.getConnection(DriverManager.java:664)
	at java.sql.DriverManager.getConnection(DriverManager.java:247)
	at org.apache.sqoop.mapreduce.db.DBConfiguration.getConnection(DBConfiguration.java:300)
	at org.apache.sqoop.mapreduce.db.DBInputFormat.getConnection(DBInputFormat.java:213)
	... 11 more
Caused by: java.net.SocketException: Broken pipe (Write failed)
	at java.net.SocketOutputStream.socketWrite0(Native Method)
	at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:111)
	at java.net.SocketOutputStream.write(SocketOutputStream.java:155)
	at sun.security.ssl.OutputRecord.writeBuffer(OutputRecord.java:431)
	at sun.security.ssl.OutputRecord.write(OutputRecord.java:417)
	at sun.security.ssl.SSLSocketImpl.writeRecordInternal(SSLSocketImpl.java:879)
	at sun.security.ssl.SSLSocketImpl.writeRecord(SSLSocketImpl.java:850)
	at sun.security.ssl.SSLSocketImpl.writeRecord(SSLSocketImpl.java:720)
	at sun.security.ssl.Handshaker.sendChangeCipherSpec(Handshaker.java:1144)
	at sun.security.ssl.ClientHandshaker.sendChangeCipherAndFinish(ClientHandshaker.java:1280)
	at sun.security.ssl.ClientHandshaker.serverHelloDone(ClientHandshaker.java:1190)
	at sun.security.ssl.ClientHandshaker.processMessage(ClientHandshaker.java:369)
	at sun.security.ssl.Handshaker.processLoop(Handshaker.java:1037)
	at sun.security.ssl.Handshaker.process_record(Handshaker.java:965)
	at sun.security.ssl.SSLSocketImpl.readRecord(SSLSocketImpl.java:1064)
	at sun.security.ssl.SSLSocketImpl.performInitialHandshake(SSLSocketImpl.java:1367)
	at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1395)
	at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1379)
	at com.mysql.jdbc.ExportControlled.transformSocketToSSLSocket(ExportControlled.java:188)
	... 30 more
19/07/22 09:08:13 INFO mapreduce.Job: Task Id : attempt_1563776875770_0013_m_000000_0, Status : FAILED
Error: java.lang.RuntimeException: java.lang.RuntimeException: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure
The last packet successfully received from the server was 26,364 milliseconds ago. The last packet sent successfully to the server was 26,048 milliseconds ago.
	[stack trace identical to attempt m_000002_0 above]
19/07/22 09:08:13 INFO mapreduce.Job: Task Id : attempt_1563776875770_0013_m_000001_0, Status : FAILED
Error: java.lang.RuntimeException: java.lang.RuntimeException: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure
The last packet successfully received from the server was 26,273 milliseconds ago. The last packet sent successfully to the server was 25,719 milliseconds ago.
	[stack trace identical to attempt m_000002_0 above]
19/07/22 09:09:54 INFO mapreduce.Job: map 38% reduce 0%
19/07/22 09:11:35 INFO mapreduce.Job: map 50% reduce 0%
19/07/22 09:11:36 INFO mapreduce.Job: map 75% reduce 0%
19/07/22 09:13:00 INFO mapreduce.Job: map 88% reduce 0%
19/07/22 09:13:01 INFO mapreduce.Job: map 100% reduce 0%
19/07/22 09:13:04 INFO mapreduce.Job: Job job_1563776875770_0013 completed successfully
19/07/22 09:13:05 INFO mapreduce.Job: Counters: 33
	File System Counters
		FILE: Number of bytes read=0
		FILE: Number of bytes written=1958080
		FILE: Number of read operations=0
		FILE: Number of large read operations=0
		FILE: Number of write operations=0
		HDFS: Number of bytes read=926
		HDFS: Number of bytes written=1029
		HDFS: Number of read operations=48
		HDFS: Number of large read operations=0
		HDFS: Number of write operations=16
	Job Counters
		Failed map tasks=3
		Launched map tasks=11
		Other local map tasks=11
		Total time spent by all maps in occupied slots (ms)=4579240
		Total time spent by all reduces in occupied slots (ms)=0
		Total time spent by all map tasks (ms)=1144810
		Total vcore-milliseconds taken by all map tasks=1144810
		Total megabyte-milliseconds taken by all map tasks=1172285440
	Map-Reduce Framework
		Map input records=58
		Map output records=58
		Input split bytes=926
		Spilled Records=0
		Failed Shuffles=0
		Merged Map outputs=0
		GC time elapsed (ms)=10572
		CPU time spent (ms)=131260
		Physical memory (bytes) snapshot=1669877760
		Virtual memory (bytes) snapshot=22823571456
		Total committed heap usage (bytes)=1133510656
		Peak Map Physical memory (bytes)=240259072
		Peak Map Virtual memory (bytes)=2854375424
	File Input Format Counters
		Bytes Read=0
	File Output Format Counters
		Bytes Written=1029
19/07/22 09:13:05 INFO mapreduce.ImportJobBase: Transferred 1.0049 KB in 524.1751 seconds (1.9631 bytes/sec)
19/07/22 09:13:05 INFO mapreduce.ImportJobBase: Retrieved 58 records.
Mon Jul 22 09:13:06 UTC 2019 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
19/07/22 09:13:06 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1
19/07/22 09:13:06 INFO hive.HiveImport: Loading uploaded data into Hive
19/07/22 09:13:31 INFO hive.HiveImport: SLF4J: Class path contains multiple SLF4J bindings.
19/07/22 09:13:31 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
19/07/22 09:13:31 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
19/07/22 09:13:31 INFO hive.HiveImport: SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
19/07/22 09:13:31 INFO hive.HiveImport: SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory] 19/07/22 09:14:01 INFO hive.HiveImport: Connecting to jdbc:hive2://sandbox-hdp.hortonworks.com:2181/default;password=hive;serviceDiscoveryMode=zooKeeper;user=hive;zooKeeperNamespace=hiveserver2 19/07/22 09:14:04 INFO hive.HiveImport: 19/07/22 09:14:04 [main]: INFO jdbc.HiveConnection: Connected to sandbox-hdp.hortonworks.com:10000 19/07/22 09:14:05 INFO hive.HiveImport: Connected to: Apache Hive (version 3.1.0.3.0.1.0-187) 19/07/22 09:14:05 INFO hive.HiveImport: Driver: Hive JDBC (version 3.1.0.3.0.1.0-187) 19/07/22 09:14:05 INFO hive.HiveImport: Transaction isolation: TRANSACTION_REPEATABLE_READ 19/07/22 09:14:06 INFO hive.HiveImport: 0: jdbc:hive2://sandbox-hdp.hortonworks.com:2> CREATE TABLE `retaildb`.`categories` ( `category_id` INT, `category_department_id` INT, `category_name` STRING) COMMENT 'Imp orted by sqoop on 2019/07/22 09:13:06' R 19/07/22 09:14:06 INFO hive.HiveImport: OW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE; 19/07/22 09:14:13 INFO hive.HiveImport: INFO : Compiling command(queryId=hive_20190722091406_5060496e-29ec-4595-94c8-92308bcfdc9d): CREATE TABLE `retaildb`.`categories` ( `category_id` INT, `category_department _id` INT, `category_name` STRING) COMMENT 'Imported by sqoop on 2019/07/22 09:13:06' ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE 19/07/22 09:14:13 INFO hive.HiveImport: INFO : Semantic Analysis Completed (retrial = false) 19/07/22 09:14:13 INFO hive.HiveImport: INFO : Returning Hive schema: Schema(fieldSchemas:null, properties:null) 19/07/22 09:14:13 INFO hive.HiveImport: INFO : Completed compiling command(queryId=hive_20190722091406_5060496e-29ec-4595-94c8-92308bcfdc9d); Time taken: 0.484 seconds 19/07/22 09:14:13 INFO hive.HiveImport: INFO :
	Executing command(queryId=hive_20190722091406_5060496e-29ec-4595-94c8-92308bcfdc9d): CREATE TABLE `retaildb`.`categories` ( `category_id` INT, `category_department _id` INT, `category_name` STRING) COMMENT 'Imported by sqoop on 2019/07/22 09:13:06' ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE 19/07/22 09:14:13 INFO hive.HiveImport: INFO : Starting task [Stage-0:DDL] in serial mode 19/07/22 09:14:13 INFO hive.HiveImport: INFO : Completed executing command(queryId=hive_20190722091406_5060496e-29ec-4595-94c8-92308bcfdc9d); Time taken: 2.437 seconds 19/07/22 09:14:13 INFO hive.HiveImport: INFO : OK 19/07/22 09:14:13 INFO hive.HiveImport: No rows affected (6.77 seconds) 19/07/22 09:14:14 INFO hive.HiveImport: 0: jdbc:hive2://sandbox-hdp.hortonworks.com:2> LOAD DATA INPATH 'hdfs://sandbox-hdp.hortonworks.com:8020/user/root/categories' OVERWRITE INTO TABLE `retaildb`.`categories` ; 19/07/22 09:14:17 INFO hive.HiveImport: INFO : Compiling command(queryId=hive_20190722091414_95d51666-88d4-48f6-be08-da61c08071c9): LOAD DATA INPATH 'hdfs://sandbox-hdp.hortonworks.com:8020/user/root/categories ' OVERWRITE INTO TABLE `retaildb`.`categories` 19/07/22 09:14:17 INFO hive.HiveImport: INFO : Semantic Analysis Completed (retrial = false) 19/07/22 09:14:17 INFO hive.HiveImport: INFO : Returning Hive schema: Schema(fieldSchemas:null, properties:null) 19/07/22 09:14:17 INFO hive.HiveImport: INFO : Completed compiling command(queryId=hive_20190722091414_95d51666-88d4-48f6-be08-da61c08071c9); Time taken: 0.538 seconds 19/07/22 09:14:17 INFO hive.HiveImport: INFO : Executing command(queryId=hive_20190722091414_95d51666-88d4-48f6-be08-da61c08071c9): LOAD DATA INPATH 'hdfs://sandbox-hdp.hortonworks.com:8020/user/root/categories ' OVERWRITE INTO TABLE `retaildb`.`categories` 19/07/22 09:14:17 INFO hive.HiveImport: INFO : Starting task [Stage-0:MOVE] in serial mode 19/07/22 09:14:17 INFO hive.HiveImport: INFO : Loading data to table 
retaildb.categories from hdfs://sandbox-hdp.hortonworks.com:8020/user/root/categories 19/07/22 09:14:17 INFO hive.HiveImport: ERROR : FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MoveTask. org.apache.hadoop.hive.ql.metadata.HiveException: Access denied: Unable to mov e source hdfs://sandbox-hdp.hortonworks.com:8020/user/root/categories/part-m-00000 to destination hdfs://sandbox-hdp.hortonworks.com:8020/warehouse/tablespace/managed/hive/retaildb.db/categories/base_0000001: Pe rmission denied: user=hive, access=WRITE, inode="/user/root/categories":root:hdfs:drwxr-xr-x 19/07/22 09:14:17 INFO hive.HiveImport: INFO : Completed executing command(queryId=hive_20190722091414_95d51666-88d4-48f6-be08-da61c08071c9); Time taken: 1.907 seconds 19/07/22 09:14:17 INFO hive.HiveImport: Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MoveTask. org.apache.hadoop.hive.ql.metadata.HiveExcept ion: Access denied: Unable to move source hdfs://sandbox-hdp.hortonworks.com:8020/user/root/categories/part-m-00000 to destination hdfs://sandbox-hdp.hortonworks.com:8020/warehouse/tablespace/managed/hive/retail db.db/categories/base_0000001: Permission denied: user=hive, access=WRITE, inode="/user/root/categories":root:hdfs:drwxr-xr-x (state=08S01,code=1) 19/07/22 09:14:17 INFO hive.HiveImport: Closing: 0: jdbc:hive2://sandbox-hdp.hortonworks.com:2181/default;password=hive;serviceDiscoveryMode=zooKeeper;user=hive;zooKeeperNamespace=hiveserver2 19/07/22 09:14:18 ERROR tool.ImportAllTablesTool: Encountered IOException running import job: java.io.IOException: Hive exited with status 2
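The failing step is the final MoveTask: HiveServer2 runs as user `hive`, but the Sqoop staging directory `/user/root/categories` is owned by `root:hdfs` with mode `drwxr-xr-x` (755), so `hive` has no WRITE access and cannot move `part-m-00000` into the warehouse. A minimal sketch of two common unblocks for a sandbox, using the paths from the log above (not a verified fix for this exact environment):

```shell
# The staging dir is root:hdfs drwxr-xr-x (755): group and others lack WRITE,
# so the 'hive' user behind HiveServer2 cannot move the part files out of it.

if command -v hdfs >/dev/null 2>&1; then
    # Option 1: open up the staging directory, then re-run the sqoop command.
    # 777 is heavy-handed, but the directory only holds transient staging data.
    sudo -u hdfs hdfs dfs -chmod -R 777 /user/root/categories
else
    echo "hdfs client not found; run these commands on the sandbox node"
fi

# Option 2: re-run the whole import as the hive user, so the staging files
# land under /user/hive and are owned by the same user that loads them.
# Note -P (prompt for password) instead of --password, per the Sqoop warning.
# sudo -u hive sqoop import-all-tables -m 1 \
#     --connect "jdbc:mysql://sandbox-hdp.hortonworks.com:3306/retail_db" \
#     --username rajan -P \
#     --hive-import --hive-overwrite \
#     --hive-database retaildb --create-hive-table
```

Either way, the point is that the user who wrote the staging files and the user performing the Hive LOAD must agree on write access to the staging directory.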
