New Contributor
Posts: 3
Registered: ‎05-15-2018
Accepted Solution

unable to import data from mysql to sqoop to HDFS

I have tried a couple of times. I am new to this; can someone tell me where I am going wrong? I am getting the following error:


[cloudera@quickstart ~]$ sqoop import --connect jdbc:mysql://localhost:3306/myfirsttutorial --username retail_dba --password cloudera --table mytable --target-dir /sqoop_im port_data -m 1
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
18/05/15 13:46:08 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.13.0
18/05/15 13:46:08 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
18/05/15 13:46:08 ERROR tool.BaseSqoopTool: Error parsing arguments for import:
18/05/15 13:46:08 ERROR tool.BaseSqoopTool: Unrecognized argument: port_data
18/05/15 13:46:08 ERROR tool.BaseSqoopTool: Unrecognized argument: -m

Posts: 1,760
Kudos: 378
Solutions: 281
Registered: ‎07-31-2013

Re: unable to import data from mysql to sqoop to HDFS

Your target directory path has a space character in it, which makes Sqoop
think two different values are being passed. Remove the space inside
'im port' so the path reads '/sqoop_import_data'.
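You can see the split the reply describes with a small shell sketch; printargs below is a hypothetical helper, not part of Sqoop, and simply echoes each argument the way Sqoop's parser receives them:

```shell
# printargs prints each argument on its own line, mimicking how a
# command-line parser sees the words after shell word-splitting.
printargs() {
  for a in "$@"; do
    printf 'arg: %s\n' "$a"
  done
}

# The unquoted space splits the path into two arguments, so Sqoop sees
# "port_data" as a stray extra token:
printargs --target-dir /sqoop_im port_data
```

With the space removed, `--target-dir /sqoop_import_data` arrives as exactly two arguments and the parse succeeds.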

Re: unable to import data from mysql to sqoop to HDFS

Hello,

Thank you for your reply. The command parses now, but the import still fails with a different error:

[cloudera@quickstart ~]$ sqoop import --connect jdbc:mysql://localhost:3306/myfirsttutorial --username retail_dba --password cloudera --table mytable --target-dir /sqoop_import_data -m 1
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
18/05/15 14:33:02 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.13.0
18/05/15 14:33:02 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
18/05/15 14:33:03 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
18/05/15 14:33:03 INFO tool.CodeGenTool: Beginning code generation
18/05/15 14:33:05 ERROR manager.SqlManager: Error executing statement: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Access denied for user 'retail_dba'@'%' to database 'myfirsttutorial'
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Access denied for user 'retail_dba'@'%' to database 'myfirsttutorial'
 at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
 at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
 at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
 at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
 at com.mysql.jdbc.Util.handleNewInstance(Util.java:377)
 at com.mysql.jdbc.Util.getInstance(Util.java:360)
 at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:978)
 at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3887)
 at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3823)
 at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:870)
 at com.mysql.jdbc.MysqlIO.secureAuth411(MysqlIO.java:4332)
 at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:1258)
 at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2234)
 at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2265)
 at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2064)
 at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:790)
 at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:44)
 at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
 at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
 at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
 at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
 at com.mysql.jdbc.Util.handleNewInstance(Util.java:377)
 at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:395)
 at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:325)
 at java.sql.DriverManager.getConnection(DriverManager.java:571)
 at java.sql.DriverManager.getConnection(DriverManager.java:215)
 at org.apache.sqoop.manager.SqlManager.makeConnection(SqlManager.java:904)
 at org.apache.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:52)
 at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:763)
 at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:786)
 at org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java:289)
 at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:260)
 at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:246)
 at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:327)
 at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1858)
 at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1657)
 at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:106)
 at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:494)
 at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:621)
 at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
 at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
 at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
 at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
 at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
 at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
18/05/15 14:33:05 ERROR tool.ImportTool: Import failed: java.io.IOException: No columns to generate for ClassWriter
 at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1663)
 at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:106)
 at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:494)
 at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:621)
 at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
 at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
 at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
 at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
 at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
 at org.apache.sqoop.Sqoop.main(Sqoop.java:252)


Re: unable to import data from mysql to sqoop to HDFS

The database is returning an error this time. The message is self-explanatory:

Access denied for user 'retail_dba'@'%' to database 'myfirsttutorial'

Follow https://dev.mysql.com/doc/refman/5.5/en/grant.html
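A minimal sketch of the grant the linked page describes, run against a live MySQL server. The database name, user, and passwords here are taken from the posts above and assume the quickstart VM's root password is `cloudera`; adjust for your setup:

```shell
# Grant retail_dba access to the myfirsttutorial database
# (MySQL 5.5 syntax, per the linked GRANT documentation).
mysql -u root -pcloudera <<'SQL'
GRANT ALL PRIVILEGES ON myfirsttutorial.* TO 'retail_dba'@'%' IDENTIFIED BY 'cloudera';
FLUSH PRIVILEGES;
SQL
```

After the grant, re-run the same sqoop import command and the "Access denied" error should no longer appear.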
New Contributor
Posts: 2
Registered: ‎09-10-2018

Re: unable to import data from mysql to sqoop to HDFS

[cloudera@quickstart ~]$ sqoop import \
>   --connect jdbc:mysql://192.168.0.106:3306/retail_db \
>   --username root \
>   --password cloudera \
>   --table order_items \
>   --warehouse-dir /sqoop_import_retail_db
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
18/09/10 03:17:02 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.13.0
18/09/10 03:17:02 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
18/09/10 03:17:03 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
18/09/10 03:17:03 INFO tool.CodeGenTool: Beginning code generation
18/09/10 03:17:03 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `order_items` AS t LIMIT 1
18/09/10 03:17:03 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `order_items` AS t LIMIT 1
18/09/10 03:17:03 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/lib/hadoop-mapreduce
Note: /tmp/sqoop-cloudera/compile/fc369afb75fc7de3e34641ae6ca1ff73/order_items.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
18/09/10 03:17:04 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-cloudera/compile/fc369afb75fc7de3e34641ae6ca1ff73/order_items.jar
18/09/10 03:17:04 WARN manager.MySQLManager: It looks like you are importing from mysql.
18/09/10 03:17:04 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
18/09/10 03:17:04 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
18/09/10 03:17:04 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
18/09/10 03:17:04 INFO mapreduce.ImportJobBase: Beginning import of order_items
18/09/10 03:17:04 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
18/09/10 03:17:05 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
18/09/10 03:17:05 INFO client.RMProxy: Connecting to ResourceManager at quickstart.cloudera/192.168.0.106:8032
18/09/10 03:17:05 WARN ipc.Client: Failed to connect to server: quickstart.cloudera/192.168.0.106:8020: try once and fail.
java.net.ConnectException: Connection refused
        at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
        at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
        at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
        at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)
        at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:494)
        at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:648)
        at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:744)
        at org.apache.hadoop.ipc.Client$Connection.access$3000(Client.java:396)
        at org.apache.hadoop.ipc.Client.getConnection(Client.java:1557)
        at org.apache.hadoop.ipc.Client.call(Client.java:1480)
        at org.apache.hadoop.ipc.Client.call(Client.java:1441)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
        at com.sun.proxy.$Proxy17.getFileInfo(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:786)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:260)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:104)
        at com.sun.proxy.$Proxy18.getFileInfo(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2131)
        at org.apache.hadoop.hdfs.DistributedFileSystem$20.doCall(DistributedFileSystem.java:1265)
        at org.apache.hadoop.hdfs.DistributedFileSystem$20.doCall(DistributedFileSystem.java:1261)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1261)
        at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1418)
        at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:145)
        at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:270)
        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:143)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1307)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1304)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1917)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:1304)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1325)
        at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:203)
        at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:176)
        at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:273)
        at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:692)
        at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:127)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:513)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:621)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
18/09/10 03:17:05 WARN security.UserGroupInformation: PriviledgedActionException as:cloudera (auth:SIMPLE) cause:java.net.ConnectException: Call From quickstart.cloudera/192.168.0.106 to quickstart.cloudera:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
18/09/10 03:17:05 ERROR tool.ImportTool: Import failed: java.net.ConnectException: Call From quickstart.cloudera/192.168.0.106 to quickstart.cloudera:8020 failed on connection exception: java.net.ConnectException: Connection refused;

I am getting a "Connection refused" error when running:

sqoop import \
--connect jdbc:mysql://192.168.0.106:3306/retail_db \
--username root \
--password cloudera \
--table order_items \
--warehouse-dir /home/maddy/sqoop_import/retail_db



Re: unable to import data from mysql to sqoop to HDFS

Please open a new topic as your issue is unrelated to this topic. This helps keep issues separate and improves your search experience.

Effectively, your issue is that your YARN ResourceManager is either (1) down, due to a crash explained in the /var/log/hadoop-yarn/*.out files, or (2) not serving on the external address that quickstart.cloudera runs on. For the latter, ensure that 'nslookup $(hostname -f)' resolves to the VM's external address, not localhost/127.0.0.1.
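The two checks above can be run on the VM itself; the service name below is the one used on the CDH quickstart VM and may differ on other installs:

```shell
# 1) Is the ResourceManager up? Any crash details end up in the .out logs:
sudo service hadoop-yarn-resourcemanager status
tail /var/log/hadoop-yarn/*.out

# 2) Does the full hostname resolve to the external address (not 127.0.0.1)?
nslookup "$(hostname -f)"
```

If step 2 returns 127.0.0.1, fix /etc/hosts so quickstart.cloudera maps to the VM's external IP, then restart the Hadoop services.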