
Importing a database table from MySQL to Hive using Sqoop

Explorer

Hi,

Basically, I am trying to import a database table from MySQL to Hive, but the import fails with the following error. Can anyone please help me with a solution? It would be very much appreciated.

[cloudera@quickstart ~]$ sqoop import --connect jdbc:mysql://localhost:3306/sqoop --username root --password cloudera --table customer --hive-import
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
19/11/27 23:11:33 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.13.0
19/11/27 23:11:33 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
19/11/27 23:11:33 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
19/11/27 23:11:33 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
19/11/27 23:11:33 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
19/11/27 23:11:33 INFO tool.CodeGenTool: Beginning code generation
19/11/27 23:11:33 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `customer` AS t LIMIT 1
19/11/27 23:11:33 ERROR manager.SqlManager: Error reading from database: java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@2bbfdaa4 is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@2bbfdaa4 is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:934)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:931)
at com.mysql.jdbc.MysqlIO.checkForOutstandingStreamingData(MysqlIO.java:2735)
at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1899)
at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2151)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2619)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2569)
at com.mysql.jdbc.StatementImpl.executeQuery(StatementImpl.java:1524)
at com.mysql.jdbc.ConnectionImpl.getMaxBytesPerChar(ConnectionImpl.java:3003)
at com.mysql.jdbc.Field.getMaxBytesPerCharacter(Field.java:602)
at com.mysql.jdbc.ResultSetMetaData.getPrecision(ResultSetMetaData.java:445)
at org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java:305)
at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:260)
at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:246)
at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:327)
at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1858)
at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1657)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:106)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:494)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:621)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
19/11/27 23:11:33 ERROR tool.ImportTool: Import failed: java.io.IOException: No columns to generate for ClassWriter
at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1663)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:106)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:494)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:621)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)

[cloudera@quickstart ~]$

1 ACCEPTED SOLUTION

Master Mentor

@lalprasanth 

 

Can you add the MySQL driver option to your command?

 

$ sqoop import  --connect jdbc:mysql://localhost:3306/sqoop  --username root  --password cloudera --table customer  --hive-import  --driver com.mysql.jdbc.Driver
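For context, a likely reason `--driver` helps here: naming the driver class explicitly makes Sqoop fall back to its generic JDBC connection manager instead of the MySQL-specific one that uses streaming result sets, which is what the "Streaming result set ... is still active" error comes from. Below is a minimal sketch of the same import, with `-P` substituted for the plain-text password as the log's own warning suggests. It is echoed as a string rather than executed, since running it needs a live cluster:

```shell
# Sketch only (not verified against a live cluster): the same import with the
# explicit JDBC driver class, and -P so Sqoop prompts for the password instead
# of it appearing in the command line and shell history.
SQOOP_IMPORT="sqoop import \
  --connect jdbc:mysql://localhost:3306/sqoop \
  --username root \
  -P \
  --table customer \
  --hive-import \
  --driver com.mysql.jdbc.Driver"

echo "$SQOOP_IMPORT"
```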

 

Please let me know.

 


5 REPLIES


Explorer

@Shelton 

 

It gives me this response:

(screenshot attached: clouderavel.JPG)

Master Mentor

@lalprasanth 

That's expected because you have a double "$$" in your sqoop command: you forgot to remove the leading "$" prompt from the command I posted. Please remove it and let me know.

Thank you

Explorer

@Shelton 

 

It works fine now with this command; I also had to add --split-by id:

 

sqoop import --connect jdbc:mysql://localhost:3306/sqoop --username root --password cloudera --split-by id --table customer --hive-import --driver com.mysql.jdbc.Driver

 

Thank you so much!

New Contributor

What worked for me is:

sqoop import --connect jdbc:mysql://localhost:3306/classicmodels --username root --password hadoop --split-by id --m 1 --table customers --hive-import --driver com.mysql.jdbc.Driver

 

I just added --m 1.
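The two workarounds in this thread are related: Sqoop normally splits an import across parallel map tasks, and when it cannot pick a split column on its own (e.g. no usable primary key), you either name one with --split-by or drop to a single mapper with -m 1, in which case no split column is needed. A sketch of both variants, echoed as strings rather than run (the classicmodels database and id column are from the posts above, and -P replaces the plain-text password):

```shell
# Sketch only: two ways around the split-column problem.

# Option 1: keep parallel mappers, but tell Sqoop which column to split on.
WITH_SPLIT="sqoop import --connect jdbc:mysql://localhost:3306/classicmodels \
  --username root -P --table customers --hive-import \
  --driver com.mysql.jdbc.Driver --split-by id"

# Option 2: a single mapper (-m 1); no split column is needed at all.
SINGLE_MAPPER="sqoop import --connect jdbc:mysql://localhost:3306/classicmodels \
  --username root -P --table customers --hive-import \
  --driver com.mysql.jdbc.Driver -m 1"

echo "$WITH_SPLIT"
echo "$SINGLE_MAPPER"
```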