
Exercise 1 Sqoop import fails with : Retrying connect to server:


[cloudera@quickstart retail_db]$ sudo sqoop import-all-tables -m 1 --connect jdbc:mysql://192.168.67.128:3306/retail_db --username=retail_dba --password=cloudera --compression-codec=snappy --as-parquetfile --warehouse-dir=/user/hive/warehouse --hive-import
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
16/12/10 01:24:51 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.8.0
16/12/10 01:24:51 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
16/12/10 01:24:51 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
16/12/10 01:24:51 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
16/12/10 01:24:51 WARN tool.BaseSqoopTool: It seems that you're doing hive import directly into default
16/12/10 01:24:51 WARN tool.BaseSqoopTool: hive warehouse directory which is not supported. Sqoop is
16/12/10 01:24:51 WARN tool.BaseSqoopTool: firstly importing data into separate directory and then
16/12/10 01:24:51 WARN tool.BaseSqoopTool: inserting data into hive. Please consider removing
16/12/10 01:24:51 WARN tool.BaseSqoopTool: --target-dir or --warehouse-dir into /user/hive/warehouse in
16/12/10 01:24:51 WARN tool.BaseSqoopTool: case that you will detect any issues.
16/12/10 01:24:52 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
16/12/10 01:24:52 INFO tool.CodeGenTool: Beginning code generation
16/12/10 01:24:52 INFO tool.CodeGenTool: Will generate java class as codegen_categories
16/12/10 01:24:52 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1
16/12/10 01:24:52 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1
16/12/10 01:24:52 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/lib/hadoop-mapreduce
Note: /tmp/sqoop-root/compile/87f1249dafcb0ec12a9453e89aff774f/codegen_categories.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
16/12/10 01:24:56 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/87f1249dafcb0ec12a9453e89aff774f/codegen_categories.jar
16/12/10 01:24:56 WARN manager.MySQLManager: It looks like you are importing from mysql.
16/12/10 01:24:56 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
16/12/10 01:24:56 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
16/12/10 01:24:56 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
16/12/10 01:24:56 INFO mapreduce.ImportJobBase: Beginning import of categories
16/12/10 01:24:56 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
16/12/10 01:24:58 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1
16/12/10 01:24:58 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1
16/12/10 01:24:59 INFO hive.metastore: Trying to connect to metastore with URI thrift://quickstart.cloudera:9083
16/12/10 01:24:59 INFO hive.metastore: Opened a connection to metastore, current connections: 1
16/12/10 01:25:00 INFO hive.metastore: Connected to metastore.
16/12/10 01:25:00 WARN mapreduce.DataDrivenImportJob: Target Hive table 'categories' exists! Sqoop will append data into the existing Hive table. Consider using --hive-overwrite, if you do NOT intend to do appending.
16/12/10 01:25:02 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
16/12/10 01:25:02 INFO client.RMProxy: Connecting to ResourceManager at quickstart.cloudera/192.168.67.128:8032
16/12/10 01:25:04 INFO ipc.Client: Retrying connect to server: quickstart.cloudera/192.168.67.128:8032. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
16/12/10 01:25:05 INFO ipc.Client: Retrying connect to server: quickstart.cloudera/192.168.67.128:8032. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
16/12/10 01:25:06 INFO ipc.Client: Retrying connect to server: quickstart.cloudera/192.168.67.128:8032. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
16/12/10 01:25:07 INFO ipc.Client: Retrying connect to server: quickstart.cloudera/192.168.67.128:8032. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
16/12/10 01:25:08 INFO ipc.Client: Retrying connect to server: quickstart.cloudera/192.168.67.128:8032. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
16/12/10 01:25:09 INFO ipc.Client: Retrying connect to server: quickstart.cloudera/192.168.67.128:8032. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
16/12/10 01:25:10 INFO ipc.Client: Retrying connect to server: quickstart.cloudera/192.168.67.128:8032. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
16/12/10 01:25:11 INFO ipc.Client: Retrying connect to server: quickstart.cloudera/192.168.67.128:8032. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
16/12/10 01:25:12 INFO ipc.Client: Retrying connect to server: quickstart.cloudera/192.168.67.128:8032. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
16/12/10 01:25:13 INFO ipc.Client: Retrying connect to server: quickstart.cloudera/192.168.67.128:8032. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)

 

Any ideas? I am using the Cloudera QuickStart VM 5.8; MySQL was already installed, and HDFS and Hive are shown in a green (healthy) state.


Re: Exercise 1 Sqoop import fails with : Retrying connect to server:

Champion

Can you please remove the following option (or change the path) and try again?

 

--warehouse-dir=/user/hive/warehouse

 

Thanks

Kumar
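The suggestion above can be sketched as the original command rebuilt without --warehouse-dir (host, database, and credentials are copied from the original post; this just prints the command rather than running it):

```shell
# Original sqoop invocation minus --warehouse-dir, so the Hive import
# stages data in Sqoop's default temporary location first.
SQOOP_CMD="sqoop import-all-tables -m 1 \
 --connect jdbc:mysql://192.168.67.128:3306/retail_db \
 --username=retail_dba --password=cloudera \
 --compression-codec=snappy --as-parquetfile --hive-import"
echo "$SQOOP_CMD"
```

Per the warning in the log above, Sqoop imports into a separate directory first and then loads into Hive, so pointing --warehouse-dir at /user/hive/warehouse is unsupported anyway.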

Highlighted

Re: Exercise 1 Sqoop import fails with : Retrying connect to server:

@saranvisa I tried both changing the path and removing the line

--warehouse-dir=/user/hive/warehouse

Same result. What service is expected at port 8032? It is not reachable even when I try it in a browser. Can you let me know?

 

16/12/13 11:35:58 INFO ipc.Client: Retrying connect to server: quickstart.cloudera/192.168.67.128:8032. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
16/12/13 11:35:59 INFO ipc.Client: Retrying connect to server: quickstart.cloudera/192.168.67.128:8032. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
16/12/13 11:36:00 INFO ipc.Client: Retrying connect to server: quickstart.cloudera/192.168.67.128:8032. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)


Re: Exercise 1 Sqoop import fails with : Retrying connect to server:

Champion

@anotherrohit

 

I assume there was more than one issue with the initial script, and you have resolved one of them now (because the current issue is with the source, port, and connectivity).

 

Can you confirm your source is MySQL? Is the port open? If you are not sure, please check with your system administrator and make sure connectivity works between your source and your Linux environment.

 

Also, I would recommend importing a particular small table instead of all tables for the first time.

 

Thanks

Kumar
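The port check suggested above can be scripted; a minimal sketch using bash's /dev/tcp redirection (nc or telnet would work just as well). The host and ports are the ones from the thread:

```shell
# Print "open" or "closed" for host:port, with a 3-second timeout.
# /dev/tcp is a bash feature, hence the explicit bash -c.
check_port() {
  local host=$1 port=$2
  if timeout 3 bash -c "exec 3<>/dev/tcp/$host/$port" 2>/dev/null; then
    echo open
  else
    echo closed
  fi
}
check_port 192.168.67.128 3306   # the MySQL source
check_port 192.168.67.128 8032   # the port Sqoop keeps retrying
```

Once both report open, a smaller first test than import-all-tables could be, for example, `sqoop import --connect jdbc:mysql://192.168.67.128:3306/retail_db --username retail_dba -P --table categories -m 1`.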


Re: Exercise 1 Sqoop import fails with : Retrying connect to server:

@saranvisa MySQL is present and I am able to connect using the mysql prompt on the same host (with the default port 3306). However, what service runs on port 8032? The error states it is unable to connect on 8032. Is this the port for Hive? Can you please let me know.


Re: Exercise 1 Sqoop import fails with : Retrying connect to server:

It turns out YARN needs to be running for a Sqoop import (port 8032 is the YARN ResourceManager RPC port, as the log below shows). Once YARN was started, the error changed to something different. I am still unclear about the exception being thrown:

16/12/13 18:00:15 INFO client.RMProxy: Connecting to ResourceManager at quickstart.cloudera/192.168.67.128:8032
16/12/13 18:00:15 WARN hdfs.DFSClient: Caught exception
java.lang.InterruptedException
at java.lang.Object.wait(Native Method)
at java.lang.Thread.join(Thread.java:1281)
at java.lang.Thread.join(Thread.java:1355)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:862)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.endBlock(DFSOutputStream.java:600)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:789)
16/12/13 18:00:15 WARN hdfs.DFSClient: Caught exception
java.lang.InterruptedException
at java.lang.Object.wait(Native Method)
at java.lang.Thread.join(Thread.java:1281)
at java.lang.Thread.join(Thread.java:1355)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:862)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.endBlock(DFSOutputStream.java:600)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:789)
16/12/13 18:00:15 WARN hdfs.DFSClient: Caught exception
java.lang.InterruptedException
at java.lang.Object.wait(Native Method)
at java.lang.Thread.join(Thread.java:1281)
at java.lang.Thread.join(Thread.java:1355)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:862)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeInternal(DFSOutputStream.java:830)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:826)
16/12/13 18:00:15 WARN hdfs.DFSClient: Caught exception
java.lang.InterruptedException
at java.lang.Object.wait(Native Method)
at java.lang.Thread.join(Thread.java:1281)
at java.lang.Thread.join(Thread.java:1355)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:862)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.endBlock(DFSOutputStream.java:600)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:789)
16/12/13 18:00:15 WARN hdfs.DFSClient: Caught exception
java.lang.InterruptedException
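The fix found above can be checked up front before launching Sqoop. A sketch, assuming the CDH package service names used on the QuickStart VM (hadoop-yarn-resourcemanager, hadoop-yarn-nodemanager):

```shell
# Probe the YARN ResourceManager RPC port before running sqoop;
# defaults match the QuickStart VM (localhost:8032).
rm_up() {
  timeout 3 bash -c "exec 3<>/dev/tcp/${1:-localhost}/${2:-8032}" 2>/dev/null
}
if ! rm_up localhost 8032; then
  echo "ResourceManager not reachable on 8032; start YARN first:"
  echo "  sudo service hadoop-yarn-resourcemanager start"
  echo "  sudo service hadoop-yarn-nodemanager start"
fi
```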


Re: Exercise 1 Sqoop import fails with : Retrying connect to server:

Champion

@anotherrohit

 

Can you share your updated script? I hope you made some minor changes to the initial version you shared.

 

 


Re: Exercise 1 Sqoop import fails with : Retrying connect to server:

Champion

@anotherrohit

 

Try this


One of the prerequisites for Sqoop import/export is that the Sqoop version must be compatible with the MySQL Connector jar version, otherwise it will not work. Meaning, "sudo yum install mysql-connector-java" alone will not always help.

 

So try the steps below after the yum install:

1. Get the latest version of the MySQL connector:

# Sometimes yum will report that the package is already installed, referring to an old version, so this alone may not help
sudo yum localinstall https://dev.mysql.com/get/mysql57-community-release-el6-9.noarch.rpm
sudo yum install mysql-connector-java

 

2. Get the latest (or a suitable) mysql-connector from the link below:
http://dev.mysql.com/downloads/connector/j/5.1.html
wget http://dev.mysql.com/get/Downloads/Connector-J/mysql-connector-java-5.1.40.tar.gz

3. Extract the archive: tar zxvf mysql-connector-java-5.1.40.tar.gz

 

4. Copy the latest .jar to /usr/share/java/
sudo cp /home/kumar/mysql/mysql-connector-java-5.1.40/mysql-connector-java-5.1.40-bin.jar /usr/share/java/

 

5. If /usr/share/java already has a symlink mysql-connector-java.jar referring to an old connector (e.g. mysql-connector-java-5.1.31-bin.jar), remove the link and make it refer to the newly copied mysql-connector-java-5.1.40-bin.jar.

 

## Remove the old link

rm mysql-connector-java.jar 

## Create a link which refers to new .jar
ln -s /usr/share/java/mysql-connector-java-5.1.40-bin.jar mysql-connector-java.jar 
## Expected result: link should refer to latest version:

Ex: mysql-connector-java.jar -> /usr/share/java/mysql-connector-java-5.1.40-bin.jar

6. For more details, refer to the link below (topic: Installing the MySQL JDBC Driver):
http://www.cloudera.com/documentation/enterprise/5-3-x/topics/cm_ig_mysql.html
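Step 5 above can be rehearsed safely in a throwaway directory before touching /usr/share/java; the jar names follow the versions used in the post:

```shell
# Simulate the old state: a symlink pointing at the 5.1.31 connector.
demo=$(mktemp -d)
touch "$demo/mysql-connector-java-5.1.31-bin.jar" \
      "$demo/mysql-connector-java-5.1.40-bin.jar"
ln -s "$demo/mysql-connector-java-5.1.31-bin.jar" "$demo/mysql-connector-java.jar"

# Remove the old link and point it at the new jar, as in step 5.
rm "$demo/mysql-connector-java.jar"
ln -s "$demo/mysql-connector-java-5.1.40-bin.jar" "$demo/mysql-connector-java.jar"

readlink "$demo/mysql-connector-java.jar"
```

The same rm/ln -s pair, run with sudo against the real /usr/share/java paths, performs the actual relink.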

 

Thanks

Kumar
