Created on ‎11-25-2015 07:01 PM
While working with the HDP 2.3.2 sandbox and running a standard Sqoop job that moves data from MySQL to Hive, I ran into the following error:
15/11/25 17:00:30 ERROR manager.SqlManager: Error reading from database: java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@135b2072 is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
This was overcome by adding --driver com.mysql.jdbc.Driver to the end of my sqoop job creation command, like so:
sqoop job --create job1 -- import --verbose --connect 'jdbc:mysql://localhost/db1' --table table1 --username root --password-file hdfs://sandbox.hortonworks.com:8020/user/me/mysqlpasswd.txt --hcatalog-table job1 -m 1 --check-column lastupdate --incremental lastmodified --last-value '1900-01-01' --driver com.mysql.jdbc.Driver
The --driver parameter forces Sqoop to use the latest mysql-connector-java.jar installed for MySQL on the Sqoop machine. On this 2.3.2 sandbox, without this parameter, the Sqoop job fails with the error above when executed.
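Once created, the saved job can be verified and run with Sqoop's job tool. A brief sketch, assuming a working Sqoop/Hadoop environment and the job name job1 from the create command above:

```shell
# List saved jobs to confirm job1 was stored in the Sqoop metastore
sqoop job --list

# Execute the saved job; with --incremental lastmodified, Sqoop records
# the new --last-value in the metastore after each successful run
sqoop job --exec job1
```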
Created on ‎12-16-2015 08:40 PM
"sqoop import" command may work for some table, but when you try "sqoop import-all-tables", you may see the above error.
Created on ‎06-16-2016 05:35 AM
I hit the same issue Mark reported, on the HDP 2.4 Sandbox, using sqoop import on a single table.
Example command:
sqoop import --connect jdbc:mysql://192.168.1.17:3306/test --username drice --password hadoop --table client --hive-table default.client --hive-import -m 1
NOTE: Mark's workaround worked. New command:
sqoop import --connect jdbc:mysql://192.168.1.17:3306/test --username drice --password hadoop --table client --hive-table default.client --hive-import -m 1 --driver com.mysql.jdbc.Driver