
sqoop import error from mysql database


Dear All,

I am able to connect from Sqoop to MySQL, but while importing data from MySQL I am getting the error below. Please find the attached document and help me resolve this issue. sqoop-eror.txt

7 REPLIES

Master Mentor

@shrinivas acharya

Can you open this link http://hdpcluster2.blpclean.com:8088/cluster/app/application_1528094305283_0011 and copy the error message from the logs?

Is SSL enabled?
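One way to check, assuming you can reach the MySQL server from a shell (host and credentials below are taken from your connect string and are illustrative):

```shell
# Illustrative check: query the server's SSL-related variables.
# 'have_ssl' / 'have_openssl' report DISABLED when the server runs without SSL.
mysql -h 192.168.1.23 -u root -p -e "SHOW VARIABLES LIKE '%ssl%';"
```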



Sorry, I am a little new to Hadoop and Sqoop; I don't know how to check whether SSL is enabled or not.


sqoop import --connect "jdbc:mysql://192.168.1.23:3306/blp?useSSL=false" --username root --password Root@123 --table emp --hive-import --target-dir /blp/test/emp --autoreset-to-one-mapper

I just now tried using this command as well, and again I am getting the same error.

hdfs@hdpmaster:/usr/hdp/2.6.0.3-8/sqoop/lib$ sqoop import --connect "jdbc:mysql://192.168.1.23:3306/blp?useSSL=false" --username root --password Root@123 --table emp --hive-import --target-dir /blp/test/emp --autoreset-to-one-mapper
Warning: /usr/hdp/2.6.0.3-8/hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /usr/hdp/2.6.0.3-8/accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
18/06/04 18:06:53 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.6.0.3-8
18/06/04 18:06:53 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
18/06/04 18:06:53 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
18/06/04 18:06:53 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
18/06/04 18:06:53 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
18/06/04 18:06:53 INFO tool.CodeGenTool: Beginning code generation
18/06/04 18:06:54 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `emp` AS t LIMIT 1
18/06/04 18:06:54 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `emp` AS t LIMIT 1
18/06/04 18:06:54 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/2.6.0.3-8/hadoop-mapreduce
Note: /tmp/sqoop-hdfs/compile/371c0e36e1f67bef930fa5546bbabd75/emp.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
18/06/04 18:06:55 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hdfs/compile/371c0e36e1f67bef930fa5546bbabd75/emp.jar
18/06/04 18:06:55 WARN manager.MySQLManager: It looks like you are importing from mysql.
18/06/04 18:06:55 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
18/06/04 18:06:55 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
18/06/04 18:06:55 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
18/06/04 18:06:56 WARN manager.SqlManager: Split by column not provided or can't be inferred. Resetting to one mapper
18/06/04 18:06:56 INFO mapreduce.ImportJobBase: Beginning import of emp
18/06/04 18:06:56 INFO client.RMProxy: Connecting to ResourceManager at hdpcluster2.blpclean.com/192.168.1.12:8050
18/06/04 18:06:56 INFO client.AHSProxy: Connecting to Application History server at hdpmaster.blpclean.com/192.168.1.10:10200
18/06/04 18:07:04 INFO db.DBInputFormat: Using read commited transaction isolation
18/06/04 18:07:04 INFO mapreduce.JobSubmitter: number of splits:1
18/06/04 18:07:04 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1528094305283_0014
18/06/04 18:07:04 INFO impl.YarnClientImpl: Submitted application application_1528094305283_0014
18/06/04 18:07:04 INFO mapreduce.Job: The url to track the job: http://hdpcluster2.blpclean.com:8088/proxy/application_1528094305283_0014/
18/06/04 18:07:04 INFO mapreduce.Job: Running job: job_1528094305283_0014
18/06/04 18:07:22 INFO mapreduce.Job: Job job_1528094305283_0014 running in uber mode : false
18/06/04 18:07:22 INFO mapreduce.Job: map 0% reduce 0%
18/06/04 18:07:23 INFO mapreduce.Job: Job job_1528094305283_0014 failed with state FAILED due to: Application application_1528094305283_0014 failed 2 times due to AM Container for appattempt_1528094305283_0014_000002 exited with exitCode: -1000
For more detailed output, check the application tracking page: http://hdpcluster2.blpclean.com:8088/cluster/app/application_1528094305283_0014 Then click on links to logs of each attempt.
Diagnostics: ExitCodeException exitCode=2:
gzip: /hadoop/yarn/local/filecache/15_tmp/tmp_mapreduce.tar.gz: unexpected end of file
tar: Unexpected EOF in archive
tar: Unexpected EOF in archive
tar: Error is not recoverable: exiting now
Failing this attempt. Failing the application.
18/06/04 18:07:23 INFO mapreduce.Job: Counters: 0
18/06/04 18:07:23 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
18/06/04 18:07:23 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 26.1843 seconds (0 bytes/sec)
18/06/04 18:07:23 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
18/06/04 18:07:23 INFO mapreduce.ImportJobBase: Retrieved 0 records.
18/06/04 18:07:23 ERROR tool.ImportTool: Error during import: Import job failed!
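The telling part of this output is the Diagnostics block: the MapReduce framework archive that YARN localizes into its filecache is truncated, which is a cluster-side problem rather than a Sqoop problem. A sketch of how one might verify the source archive, assuming the standard HDP 2.6 layout (the HDFS path below is an assumption; adjust for your cluster):

```shell
# Sketch, assuming the standard HDP layout /hdp/apps/<version>/mapreduce/.
# The AM fails because the localized mapreduce.tar.gz is truncated
# ("unexpected end of file"), so check the copy stored in HDFS:
hdfs dfs -get /hdp/apps/2.6.0.3-8/mapreduce/mapreduce.tar.gz /tmp/mapreduce.tar.gz

# gzip -t exits non-zero exactly when an archive is truncated or corrupted:
gzip -t /tmp/mapreduce.tar.gz && echo "archive OK" || echo "archive corrupted"

# If corrupted, re-upload a good copy from a node's local HDP install and
# clear the node's YARN local filecache (the directory named in the
# Diagnostics message) so it gets re-localized on the next job.
```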


Error screenshot: erro.jpg

Master Mentor

@shrinivas acharya

Here is a valid command to import the emp table from MySQL into the emp table in Hive:

sqoop import \
--connect jdbc:mysql://localhost:3306/blp \
--username root \
--password Root@123 \
--table emp \
--target-dir /blp/test/emp \
--fields-terminated-by "," \
--hive-import \
--create-hive-table \
--hive-table emp
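As a side note, the WARN line in your log is right that putting --password on the command line is insecure. A variant using --password-file (the HDFS path below is illustrative):

```shell
# Illustrative: keep the password in a file on HDFS, readable only by you.
echo -n 'Root@123' > sqoop.pwd            # -n: no trailing newline in the file
hdfs dfs -put sqoop.pwd /user/hdfs/sqoop.pwd
hdfs dfs -chmod 400 /user/hdfs/sqoop.pwd

sqoop import \
--connect jdbc:mysql://localhost:3306/blp \
--username root \
--password-file /user/hdfs/sqoop.pwd \
--table emp \
--target-dir /blp/test/emp \
--hive-import \
--create-hive-table \
--hive-table emp
```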

Please revert


@Geoffrey Shelton Okot

Hi, I tried with your command for inserting the data, and I am getting the same error.

From Sqoop I am able to view the MySQL databases and tables; the issue only occurs while inserting (importing) the data.

New Contributor

@shrinivas Can you please paste the sqoop import command you are trying to execute?
