<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: unable to import data from mysql to sqoop to HDFS in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/unable-to-import-data-from-mysql-to-sqoop-to-HDFS/m-p/67310#M78306</link>
    <description>Your target directory path has a space character in it, which makes Sqoop&lt;BR /&gt;think two different values are being passed. Remove the space in the path:&lt;BR /&gt;'im port' should be 'import'.&lt;BR /&gt;</description>
    <pubDate>Tue, 15 May 2018 21:05:52 GMT</pubDate>
    <dc:creator>Harsh J</dc:creator>
    <dc:date>2018-05-15T21:05:52Z</dc:date>
    <item>
      <title>unable to import data from mysql to sqoop to HDFS</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/unable-to-import-data-from-mysql-to-sqoop-to-HDFS/m-p/67309#M78305</link>
      <description>&lt;P&gt;I have tried a couple of times; I am new to this. Can someone tell me where I am going wrong? I am getting the following error:&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;[cloudera@quickstart ~]$ sqoop import --connect jdbc:mysql://localhost:3306/myfirsttutorial --username retail_dba --password cloudera --table mytable --target-dir /sqoop_im port_data -m 1&lt;BR /&gt;Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.&lt;BR /&gt;Please set $ACCUMULO_HOME to the root of your Accumulo installation.&lt;BR /&gt;18/05/15 13:46:08 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.13.0&lt;BR /&gt;18/05/15 13:46:08 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.&lt;BR /&gt;18/05/15 13:46:08 ERROR tool.BaseSqoopTool: Error parsing arguments for import:&lt;BR /&gt;18/05/15 13:46:08 ERROR tool.BaseSqoopTool: Unrecognized argument: port_data&lt;BR /&gt;18/05/15 13:46:08 ERROR tool.BaseSqoopTool: Unrecognized argument: -m&lt;/P&gt;</description>
      <pubDate>Tue, 15 May 2018 20:59:09 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/unable-to-import-data-from-mysql-to-sqoop-to-HDFS/m-p/67309#M78305</guid>
      <dc:creator>Samia</dc:creator>
      <dc:date>2018-05-15T20:59:09Z</dc:date>
    </item>
    <item>
      <title>Re: unable to import data from mysql to sqoop to HDFS</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/unable-to-import-data-from-mysql-to-sqoop-to-HDFS/m-p/67310#M78306</link>
      <description>Your target directory path has a space character in it, which makes Sqoop&lt;BR /&gt;think two different values are being passed. Remove the space in the path:&lt;BR /&gt;'im port' should be 'import'.&lt;BR /&gt;</description>
      <pubDate>Tue, 15 May 2018 21:05:52 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/unable-to-import-data-from-mysql-to-sqoop-to-HDFS/m-p/67310#M78306</guid>
      <dc:creator>Harsh J</dc:creator>
      <dc:date>2018-05-15T21:05:52Z</dc:date>
    </item>
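    <!-- The point in the reply above can be demonstrated without Sqoop at all: the shell splits arguments on unquoted whitespace, so a path typed as "/sqoop_im port_data" reaches the argument parser as two separate tokens. A minimal sketch in plain shell: -->

```shell
# The shell word-splits on unquoted spaces, so "/sqoop_im port_data"
# arrives as two arguments instead of one path.
set -- --target-dir /sqoop_im port_data
echo "broken: $# arguments"   # 3: --target-dir, /sqoop_im, port_data

set -- --target-dir /sqoop_import_data
echo "fixed: $# arguments"    # 2: --target-dir, /sqoop_import_data
```

    <!-- Sqoop sees the stray "port_data" token and reports it as "Unrecognized argument", exactly as in the log above; everything after it ("-m", "1") is likewise misparsed. -->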
    <item>
      <title>Re: unable to import data from mysql to sqoop to HDFS</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/unable-to-import-data-from-mysql-to-sqoop-to-HDFS/m-p/67311#M78307</link>
      <description>&lt;P&gt;Hello,&lt;/P&gt;&lt;P&gt;Thank you for your reply, but I am still getting an error, though a different one.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;[cloudera@quickstart ~]$ sqoop import --connect jdbc:mysql://localhost:3306/myfirsttutorial --username retail_dba --password cloudera --table mytable --target-dir /sqoop_import_data -m 1&lt;BR /&gt;Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.&lt;BR /&gt;Please set $ACCUMULO_HOME to the root of your Accumulo installation.&lt;BR /&gt;18/05/15 14:33:02 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.13.0&lt;BR /&gt;18/05/15 14:33:02 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.&lt;BR /&gt;18/05/15 14:33:03 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.&lt;BR /&gt;18/05/15 14:33:03 INFO tool.CodeGenTool: Beginning code generation&lt;BR /&gt;18/05/15 14:33:05 ERROR manager.SqlManager: Error executing statement: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Access denied for user 'retail_dba'@'%' to database 'myfirsttutorial'&lt;BR /&gt;com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Access denied for user 'retail_dba'@'%' to database 'myfirsttutorial'&lt;BR /&gt;&amp;nbsp;at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)&lt;BR /&gt;&amp;nbsp;at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)&lt;BR /&gt;&amp;nbsp;at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)&lt;BR /&gt;&amp;nbsp;at java.lang.reflect.Constructor.newInstance(Constructor.java:526)&lt;BR /&gt;&amp;nbsp;at com.mysql.jdbc.Util.handleNewInstance(Util.java:377)&lt;BR /&gt;&amp;nbsp;at com.mysql.jdbc.Util.getInstance(Util.java:360)&lt;BR /&gt;&amp;nbsp;at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:978)&lt;BR /&gt;&amp;nbsp;at 
com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3887)&lt;BR /&gt;&amp;nbsp;at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3823)&lt;BR /&gt;&amp;nbsp;at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:870)&lt;BR /&gt;&amp;nbsp;at com.mysql.jdbc.MysqlIO.secureAuth411(MysqlIO.java:4332)&lt;BR /&gt;&amp;nbsp;at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:1258)&lt;BR /&gt;&amp;nbsp;at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2234)&lt;BR /&gt;&amp;nbsp;at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2265)&lt;BR /&gt;&amp;nbsp;at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2064)&lt;BR /&gt;&amp;nbsp;at com.mysql.jdbc.ConnectionImpl.&amp;lt;init&amp;gt;(ConnectionImpl.java:790)&lt;BR /&gt;&amp;nbsp;at com.mysql.jdbc.JDBC4Connection.&amp;lt;init&amp;gt;(JDBC4Connection.java:44)&lt;BR /&gt;&amp;nbsp;at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)&lt;BR /&gt;&amp;nbsp;at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)&lt;BR /&gt;&amp;nbsp;at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)&lt;BR /&gt;&amp;nbsp;at java.lang.reflect.Constructor.newInstance(Constructor.java:526)&lt;BR /&gt;&amp;nbsp;at com.mysql.jdbc.Util.handleNewInstance(Util.java:377)&lt;BR /&gt;&amp;nbsp;at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:395)&lt;BR /&gt;&amp;nbsp;at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:325)&lt;BR /&gt;&amp;nbsp;at java.sql.DriverManager.getConnection(DriverManager.java:571)&lt;BR /&gt;&amp;nbsp;at java.sql.DriverManager.getConnection(DriverManager.java:215)&lt;BR /&gt;&amp;nbsp;at org.apache.sqoop.manager.SqlManager.makeConnection(SqlManager.java:904)&lt;BR /&gt;&amp;nbsp;at org.apache.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:52)&lt;BR /&gt;&amp;nbsp;at 
org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:763)&lt;BR /&gt;&amp;nbsp;at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:786)&lt;BR /&gt;&amp;nbsp;at org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java:289)&lt;BR /&gt;&amp;nbsp;at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:260)&lt;BR /&gt;&amp;nbsp;at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:246)&lt;BR /&gt;&amp;nbsp;at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:327)&lt;BR /&gt;&amp;nbsp;at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1858)&lt;BR /&gt;&amp;nbsp;at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1657)&lt;BR /&gt;&amp;nbsp;at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:106)&lt;BR /&gt;&amp;nbsp;at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:494)&lt;BR /&gt;&amp;nbsp;at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:621)&lt;BR /&gt;&amp;nbsp;at org.apache.sqoop.Sqoop.run(Sqoop.java:147)&lt;BR /&gt;&amp;nbsp;at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)&lt;BR /&gt;&amp;nbsp;at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)&lt;BR /&gt;&amp;nbsp;at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)&lt;BR /&gt;&amp;nbsp;at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)&lt;BR /&gt;&amp;nbsp;at org.apache.sqoop.Sqoop.main(Sqoop.java:252)&lt;BR /&gt;18/05/15 14:33:05 ERROR tool.ImportTool: Import failed: java.io.IOException: No columns to generate for ClassWriter&lt;BR /&gt;&amp;nbsp;at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1663)&lt;BR /&gt;&amp;nbsp;at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:106)&lt;BR /&gt;&amp;nbsp;at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:494)&lt;BR /&gt;&amp;nbsp;at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:621)&lt;BR /&gt;&amp;nbsp;at org.apache.sqoop.Sqoop.run(Sqoop.java:147)&lt;BR 
/&gt;&amp;nbsp;at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)&lt;BR /&gt;&amp;nbsp;at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)&lt;BR /&gt;&amp;nbsp;at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)&lt;BR /&gt;&amp;nbsp;at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)&lt;BR /&gt;&amp;nbsp;at org.apache.sqoop.Sqoop.main(Sqoop.java:252)&lt;/P&gt;</description>
      <pubDate>Tue, 15 May 2018 21:38:26 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/unable-to-import-data-from-mysql-to-sqoop-to-HDFS/m-p/67311#M78307</guid>
      <dc:creator>Samia</dc:creator>
      <dc:date>2018-05-15T21:38:26Z</dc:date>
    </item>
    <item>
      <title>Re: unable to import data from mysql to sqoop to HDFS</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/unable-to-import-data-from-mysql-to-sqoop-to-HDFS/m-p/67314#M78308</link>
      <description>The DB is throwing back an error this time. The message is self-explanatory&lt;BR /&gt;though:&lt;BR /&gt;&lt;BR /&gt;Access denied for user 'retail_dba'@'%' to database 'myfirsttutorial'&lt;BR /&gt;&lt;BR /&gt;Follow &lt;A href="https://dev.mysql.com/doc/refman/5.5/en/grant.html" target="_blank"&gt;https://dev.mysql.com/doc/refman/5.5/en/grant.html&lt;/A&gt;&lt;BR /&gt;</description>
      <pubDate>Tue, 15 May 2018 23:47:52 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/unable-to-import-data-from-mysql-to-sqoop-to-HDFS/m-p/67314#M78308</guid>
      <dc:creator>Harsh J</dc:creator>
      <dc:date>2018-05-15T23:47:52Z</dc:date>
    </item>
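    <!-- The GRANT behind the linked MySQL manual page would look roughly like the sketch below. The choice of ALL PRIVILEGES and the QuickStart VM root password 'cloudera' are assumptions from context, not confirmed by the thread; adjust the privilege list and host pattern to your setup. -->

```shell
# Hypothetical sketch: grant retail_dba access to the myfirsttutorial DB.
SQL="GRANT ALL PRIVILEGES ON myfirsttutorial.* TO 'retail_dba'@'%'; FLUSH PRIVILEGES;"
echo "$SQL"
# To apply it on the MySQL server (root password assumed):
#   mysql -u root -pcloudera -e "$SQL"
```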
    <item>
      <title>Re: unable to import data from mysql to sqoop to HDFS</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/unable-to-import-data-from-mysql-to-sqoop-to-HDFS/m-p/79576#M78309</link>
      <description>&lt;PRE&gt;[cloudera@quickstart ~]$ sqoop import \
&amp;gt;   --connect jdbc:mysql://192.168.0.106:3306/retail_db \
&amp;gt;   --username root \
&amp;gt;   --password cloudera \
&amp;gt;   --table order_items \
&amp;gt;   --warehouse-dir /sqoop_import_retail_db
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
18/09/10 03:17:02 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.13.0
18/09/10 03:17:02 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
18/09/10 03:17:03 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
18/09/10 03:17:03 INFO tool.CodeGenTool: Beginning code generation
18/09/10 03:17:03 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `order_items` AS t LIMIT 1
18/09/10 03:17:03 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `order_items` AS t LIMIT 1
18/09/10 03:17:03 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/lib/hadoop-mapreduce
Note: /tmp/sqoop-cloudera/compile/fc369afb75fc7de3e34641ae6ca1ff73/order_items.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
18/09/10 03:17:04 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-cloudera/compile/fc369afb75fc7de3e34641ae6ca1ff73/order_items.jar
18/09/10 03:17:04 WARN manager.MySQLManager: It looks like you are importing from mysql.
18/09/10 03:17:04 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
18/09/10 03:17:04 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
18/09/10 03:17:04 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
18/09/10 03:17:04 INFO mapreduce.ImportJobBase: Beginning import of order_items
18/09/10 03:17:04 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
18/09/10 03:17:05 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
18/09/10 03:17:05 INFO client.RMProxy: Connecting to ResourceManager at quickstart.cloudera/192.168.0.106:8032
18/09/10 03:17:05 WARN ipc.Client: Failed to connect to server: quickstart.cloudera/192.168.0.106:8020: try once and fail.
java.net.ConnectException: Connection refused
        at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
        at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
        at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
        at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)
        at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:494)
        at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:648)
        at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:744)
        at org.apache.hadoop.ipc.Client$Connection.access$3000(Client.java:396)
        at org.apache.hadoop.ipc.Client.getConnection(Client.java:1557)
        at org.apache.hadoop.ipc.Client.call(Client.java:1480)
        at org.apache.hadoop.ipc.Client.call(Client.java:1441)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
        at com.sun.proxy.$Proxy17.getFileInfo(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:786)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:260)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:104)
        at com.sun.proxy.$Proxy18.getFileInfo(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2131)
        at org.apache.hadoop.hdfs.DistributedFileSystem$20.doCall(DistributedFileSystem.java:1265)
        at org.apache.hadoop.hdfs.DistributedFileSystem$20.doCall(DistributedFileSystem.java:1261)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1261)
        at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1418)
        at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:145)
        at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:270)
        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:143)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1307)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1304)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1917)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:1304)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1325)
        at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:203)
        at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:176)
        at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:273)
        at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:692)
        at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:127)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:513)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:621)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
18/09/10 03:17:05 WARN security.UserGroupInformation: PriviledgedActionException as:cloudera (auth:SIMPLE) cause:java.net.ConnectException: Call From quickstart.cloudera/192.168.0.106 to quickstart.cloudera:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
18/09/10 03:17:05 ERROR tool.ImportTool: Import failed: java.net.ConnectException: Call From quickstart.cloudera/192.168.0.106 to quickstart.cloudera:8020 failed on connection exception: java.net.ConnectException: Connection refused;&lt;/PRE&gt;&lt;P&gt;I am getting a connection-refused error when running:&lt;BR /&gt;&lt;BR /&gt;sqoop import \&lt;BR /&gt;--connect jdbc:mysql://192.168.0.106:3306/retail_db \&lt;BR /&gt;--username root \&lt;BR /&gt;--password cloudera \&lt;BR /&gt;--table order_items \&lt;BR /&gt;--warehouse-dir /home/maddy/sqoop_import/retail_db&lt;BR /&gt;&lt;/P&gt;</description>
      <pubDate>Mon, 10 Sep 2018 10:22:46 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/unable-to-import-data-from-mysql-to-sqoop-to-HDFS/m-p/79576#M78309</guid>
      <dc:creator>Maddy</dc:creator>
      <dc:date>2018-09-10T10:22:46Z</dc:date>
    </item>
    <item>
      <title>Re: unable to import data from mysql to sqoop to HDFS</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/unable-to-import-data-from-mysql-to-sqoop-to-HDFS/m-p/79629#M78310</link>
      <description>Please open a new topic, as your issue is unrelated to this one; keeping issues separate also improves the search experience.&lt;BR /&gt;&lt;BR /&gt;Effectively, your issue is that your YARN ResourceManager is either (1) down, due to a crash explained in the /var/log/hadoop-yarn/*.out files, or (2) not serving on the external address that quickstart.cloudera runs on, for which you need to ensure that 'nslookup $(hostname -f)' resolves to the external address in your VM and not localhost/127.0.0.1.</description>
      <pubDate>Tue, 11 Sep 2018 04:55:24 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/unable-to-import-data-from-mysql-to-sqoop-to-HDFS/m-p/79629#M78310</guid>
      <dc:creator>Harsh J</dc:creator>
      <dc:date>2018-09-11T04:55:24Z</dc:date>
    </item>
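    <!-- The resolution check described in the reply above can be scripted. A sketch, assuming a Linux VM where getent is available (it mirrors what 'nslookup $(hostname -f)' would show): -->

```shell
# Warn if the VM's FQDN resolves to loopback instead of its external
# address; daemons then bind where remote clients cannot reach them.
fqdn=$(hostname -f)
addr=$(getent hosts "$fqdn" | awk '{print $1; exit}')
case "$addr" in
  127.*|::1) echo "WARNING: $fqdn resolves to loopback ($addr); fix /etc/hosts" ;;
  "")        echo "WARNING: $fqdn does not resolve at all" ;;
  *)         echo "OK: $fqdn resolves to $addr" ;;
esac
```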
  </channel>
</rss>