When working with the HDP 2.3.2 sandbox and running a standard Sqoop job that moves data from MySQL to Hive, I ran into the following error:

15/11/25 17:00:30 ERROR manager.SqlManager: Error reading from database: java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@135b2072 is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.

This was overcome by adding --driver com.mysql.jdbc.Driver to the end of my sqoop job --create command, like so:

sqoop job --create job1 -- import --verbose --connect 'jdbc:mysql://localhost/db1' --table table1 --username root --password-file hdfs://sandbox.hortonworks.com:8020/user/me/mysqlpasswd.txt --hcatalog-table job1 -m 1 --check-column lastupdate --incremental lastmodified --last-value '1900-01-01' --driver com.mysql.jdbc.Driver

The --driver parameter makes Sqoop fall back to its generic JDBC connection manager with the given driver class, instead of the MySQL-specific manager, which reads tables through streaming result sets and triggers the error above. On this 2.3.2 sandbox, without this parameter the Sqoop job fails as shown when executed.
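
Once saved, the job runs with sqoop job --exec, which is where the error surfaces if --driver was left out. As a sanity check you can also confirm the MySQL connector jar is present in Sqoop's lib directory (the path below is an assumption based on a typical HDP sandbox layout):

# run the saved job; without --driver this is where the error appears
sqoop job --exec job1

# confirm the MySQL JDBC driver jar is on Sqoop's classpath
# (path assumes a typical HDP sandbox layout)
ls /usr/hdp/current/sqoop-client/lib/mysql-connector-java*.jar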

Comments

"sqoop import" command may work for some table, but when you try "sqoop import-all-tables", you may see the above error.

I hit the same issue Mark reported, on the HDP 2.4 Sandbox, using sqoop import on a single table.

Example command:

sqoop import --connect jdbc:mysql://192.168.1.17:3306/test --username drice --password hadoop --table client --hive-table default.client --hive-import -m 1

NOTE: Mark's workaround worked.

New command:

sqoop import --connect jdbc:mysql://192.168.1.17:3306/test --username drice --password hadoop --table client --hive-table default.client --hive-import -m 1 --driver com.mysql.jdbc.Driver