
Sqoop import data

Contributor

Hi,

Hoping someone can advise. I am playing around with Sqoop. I can list the databases using:

sqoop list-databases --connect jdbc:mysql://127.0.0.1:3306 --username hue --password 1111

And I can list the tables:

sqoop list-tables --connect "jdbc:mysql://127.0.0.1:3306/test" --username hue --password 1111

However, when I try an import, I get an error:

sqoop import \
  --connect "jdbc:mysql://127.0.0.1:3306/test" \
  --username hue --password 1111 \
  --table testtbl \
  --target-dir /user/guest/mysqlimport

The error is below. I am not sure why this command is causing an error. Does anyone have any ideas?

Regards

Rev

Warning: /usr/hdp/2.3.2.0-2950/accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
16/01/20 17:38:31 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.3.2.0-2950
16/01/20 17:38:31 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
16/01/20 17:38:31 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
16/01/20 17:38:31 INFO tool.CodeGenTool: Beginning code generation
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.2.0-2950/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.2.0-2950/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
16/01/20 17:38:31 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `testtbl` AS t LIMIT 1
16/01/20 17:38:31 ERROR manager.SqlManager: Error reading from database: java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@4b1aa70c is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@4b1aa70c is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
        at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:934)
        at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:931)
        at com.mysql.jdbc.MysqlIO.checkForOutstandingStreamingData(MysqlIO.java:2735)
        at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1899)
        at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2151)
        at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2619)
        at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2569)
        at com.mysql.jdbc.StatementImpl.executeQuery(StatementImpl.java:1524)
        at com.mysql.jdbc.ConnectionImpl.getMaxBytesPerChar(ConnectionImpl.java:3003)
        at com.mysql.jdbc.Field.getMaxBytesPerCharacter(Field.java:602)
        at com.mysql.jdbc.ResultSetMetaData.getPrecision(ResultSetMetaData.java:445)
        at org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java:286)
        at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:241)
        at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:227)
        at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:295)
        at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1845)
        at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1645)
        at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:478)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:148)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:244)
16/01/20 17:38:31 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: No columns to generate for ClassWriter
        at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1651)
        at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:478)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:148)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:244)

1 ACCEPTED SOLUTION

Rising Star

@Revlin Abbi - That error is caused by the wrong version of the MySQL connector JAR file. Check which version you have installed:

ls -l /usr/share/java/mysql*

To work around the problem, add --driver com.mysql.jdbc.Driver to your import command.

That will get past the error, but the recommended fix is to use the correct version of the MySQL connector JAR.
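For example, the import from the original question with that workaround applied would look roughly like this (a sketch only; the connection string, credentials, table, and target directory are taken from the question and will differ in your environment):

sqoop import \
  --driver com.mysql.jdbc.Driver \
  --connect "jdbc:mysql://127.0.0.1:3306/test" \
  --username hue --password 1111 \
  --table testtbl \
  --target-dir /user/guest/mysqlimport
# Passing --driver makes Sqoop fall back to its generic JDBC manager instead of the
# MySQL-specific streaming manager, which is what avoids the "streaming result set
# ... is still active" error shown above.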


16 REPLIES

Master Mentor

Run the import as the guest user, not root. @Revlin Abbi

Master Mentor

@Revlin Abbi

ERROR tool.ImportTool: Encountered IOException running import job: org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="/user/root/.staging":hdfs:hdfs:drwxr-xr-x

fix:

su - hdfs
hdfs dfs -chown -R root:hdfs /user/root
exit

Then run the job again.
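For reference, the same fix can be run without switching to an hdfs login shell, e.g. (a sketch, assuming root can sudo to the hdfs user on the sandbox):

sudo -u hdfs hdfs dfs -chown -R root:hdfs /user/root   # same chown as above, run via sudo instead of su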

Contributor

Hi, I just tried that:

[root@sandbox ~]# su - hdfs                                                                            
[hdfs@sandbox ~]$ hdfs dfs -chown -R root:hdfs /user/root                                              

but I get:

chown: `/user/root': No such file or directory

Master Mentor

sudo -u hdfs hdfs dfs -mkdir /user/root

Then run the chown. @Revlin Abbi

Master Mentor

@Revlin Abbi

su - hdfs
hdfs dfs -mkdir -p /user/root
hdfs dfs -chown -R root:hdfs /user/root
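If you want to verify the result, listing /user should now show a root-owned home directory (a hypothetical check, not part of the original reply):

hdfs dfs -ls /user   # /user/root should be listed with owner root and group hdfs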

Contributor

Perfect, thanks guys! It works now.

Master Mentor

@Revlin Abbi are you still having issues with this? Can you accept the best answer or provide your own solution?