<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Sqoop import data in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-data/m-p/105458#M68338</link>
    <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/2030/revlin.html" nodeid="2030"&gt;@Revlin Abbi&lt;/A&gt;
&lt;/P&gt;&lt;P&gt;That error is caused by an incompatible version of the MySQL connector JAR file. Check which version is installed: &lt;/P&gt;&lt;P&gt;ls -l /usr/share/java/mysql*&lt;/P&gt;&lt;P&gt;**update**&lt;/P&gt;&lt;P&gt;add this option to your command:&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;--driver com.mysql.jdbc.Driver&lt;/STRONG&gt;&lt;/P&gt;</description>
    <pubDate>Thu, 21 Jan 2016 01:40:43 GMT</pubDate>
    <dc:creator>nsabharwal</dc:creator>
    <dc:date>2016-01-21T01:40:43Z</dc:date>
    <item>
      <title>Sqoop import data</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-data/m-p/105457#M68337</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;Hoping someone can advise. I am playing around with Sqoop. I can list the databases using:&lt;/P&gt;&lt;P&gt;sqoop list-databases --connect jdbc:mysql://127.0.0.1:3306 --username hue --password 1111&lt;/P&gt;&lt;P&gt;And I can list the tables:&lt;/P&gt;&lt;P&gt;sqoop list-tables --connect "jdbc:mysql://127.0.0.1:3306/test" --username hue  --password 1111&lt;/P&gt;&lt;P&gt;However, when I try an import, I get an error:&lt;/P&gt;&lt;P&gt;sqoop import \ &lt;/P&gt;&lt;P&gt;--connect "jdbc:mysql://127.0.0.1:3306/test" \ &lt;/P&gt;&lt;P&gt;--username hue --password 1111 \ &lt;/P&gt;&lt;P&gt;--table testtbl \ &lt;/P&gt;&lt;P&gt;--target-dir /user/guest/mysqlimport&lt;/P&gt;&lt;P&gt;The error is below. I am not sure why this code is causing an error. Does anyone have any ideas?&lt;/P&gt;&lt;P&gt;Regards&lt;/P&gt;&lt;P&gt;Rev&lt;/P&gt;&lt;PRE&gt;Warning: /usr/hdp/2.3.2.0-2950/accumulo does not exist! Accumulo imports will fail.     
Please set $ACCUMULO_HOME to the root of your Accumulo installation.                    
16/01/20 17:38:31 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.3.2.0-2950           
16/01/20 17:38:31 WARN tool.BaseSqoopTool: Setting your password on the command-line is 
insecure. Consider using -P instead.                                                    
16/01/20 17:38:31 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultse
t.                                                                                      
16/01/20 17:38:31 INFO tool.CodeGenTool: Beginning code generation                      
SLF4J: Class path contains multiple SLF4J bindings.                                     
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.2.0-2950/hadoop/lib/slf4j-log4j12-1.7.10.
jar!/org/slf4j/impl/StaticLoggerBinder.class]                                           
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.2.0-2950/zookeeper/lib/slf4j-log4j12-1.6.
1.jar!/org/slf4j/impl/StaticLoggerBinder.class]                                         
SLF4J: See &lt;A href="http://www.slf4j.org/codes.html#multiple_bindings" target="_blank"&gt;http://www.slf4j.org/codes.html#multiple_bindings&lt;/A&gt; for an explanation. 
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]                    
16/01/20 17:38:31 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `tes
ttbl` AS t LIMIT 1                                                                      
16/01/20 17:38:31 ERROR manager.SqlManager: Error reading from database: java.sql.SQLExc
eption: Streaming result set com.mysql.jdbc.RowDataDynamic@4b1aa70c is still active. No 
statements may be issued when any streaming result sets are open and in use on a given c
onnection. Ensure that you have called .close() on any active streaming result sets befo
re attempting more queries.                                                             
java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@4b1aa70c is st
ill active. No statements may be issued when any streaming result sets are open and in u
se on a given connection. Ensure that you have called .close() on any active streaming r
esult sets before attempting more queries.                                              
        at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:934)                
        at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:931)                
        at com.mysql.jdbc.MysqlIO.checkForOutstandingStreamingData(MysqlIO.java:2735)   
        at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1899)                        
        at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2151)                     
        at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2619)              
        at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2569)              
        at com.mysql.jdbc.StatementImpl.executeQuery(StatementImpl.java:1524)           
        at com.mysql.jdbc.ConnectionImpl.getMaxBytesPerChar(ConnectionImpl.java:3003)   
        at com.mysql.jdbc.Field.getMaxBytesPerCharacter(Field.java:602)                 
        at com.mysql.jdbc.ResultSetMetaData.getPrecision(ResultSetMetaData.java:445)    
        at org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java:
286)                                                                                    
        at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java
:241)                                                                                   
        at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:227)      
        at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:295)    
        at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1845)       
        at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1645)             
        at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)          
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:478)            
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)                    
        at org.apache.sqoop.Sqoop.run(Sqoop.java:148)                                   
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)                    
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184)                              
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226)                               
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235)                               
        at org.apache.sqoop.Sqoop.main(Sqoop.java:244)                                  
16/01/20 17:38:31 ERROR tool.ImportTool: Encountered IOException running import job: jav
a.io.IOException: No columns to generate for ClassWriter                                
        at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1651)             
        at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)          
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:478)            
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)                    
        at org.apache.sqoop.Sqoop.run(Sqoop.java:148)                                   
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)                    
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184)                              
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226)                               
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235)                               
        at org.apache.sqoop.Sqoop.main(Sqoop.java:244)                                  

&lt;/PRE&gt;</description>
      <pubDate>Thu, 21 Jan 2016 01:39:05 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-data/m-p/105457#M68337</guid>
      <dc:creator>Rebel</dc:creator>
      <dc:date>2016-01-21T01:39:05Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop import data</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-data/m-p/105458#M68338</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/2030/revlin.html" nodeid="2030"&gt;@Revlin Abbi&lt;/A&gt;
&lt;/P&gt;&lt;P&gt;That error is caused by an incompatible version of the MySQL connector JAR file. Check which version is installed: &lt;/P&gt;&lt;P&gt;ls -l /usr/share/java/mysql*&lt;/P&gt;&lt;P&gt;**update**&lt;/P&gt;&lt;P&gt;add this option to your command:&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;--driver com.mysql.jdbc.Driver&lt;/STRONG&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 21 Jan 2016 01:40:43 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-data/m-p/105458#M68338</guid>
      <dc:creator>nsabharwal</dc:creator>
      <dc:date>2016-01-21T01:40:43Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop import data</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-data/m-p/105459#M68339</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/2030/revlin.html" nodeid="2030"&gt;@Revlin Abbi&lt;/A&gt;
&lt;/P&gt;&lt;P&gt;try this&lt;/P&gt;&lt;P&gt;sqoop import --connect jdbc:mysql://127.0.0.1:3306/test --username root --password root --table  t1  --driver com.mysql.jdbc.Driver&lt;/P&gt;&lt;P&gt;sqoop import \&lt;/P&gt;&lt;P&gt;--connect "jdbc:mysql://127.0.0.1:3306/test" \&lt;/P&gt;&lt;P&gt;--username hue --password 1111 \&lt;/P&gt;&lt;P&gt;--table testtbl \&lt;/P&gt;&lt;P&gt;--target-dir /user/guest/mysqlimport&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;--driver com.mysql.jdbc.Driver&lt;/STRONG&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 21 Jan 2016 01:52:01 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-data/m-p/105459#M68339</guid>
      <dc:creator>nsabharwal</dc:creator>
      <dc:date>2016-01-21T01:52:01Z</dc:date>
    </item>
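Putting the suggestion above together: the asker's original command with the --driver workaround appended (a sketch; host, credentials, table, and target directory are taken from the thread, not verified):

```shell
# Workaround sketch: force the generic JDBC driver class so Sqoop does
# not rely on the (mismatched) MySQL-specific streaming result set path.
# Connection details, table, and target directory come from the thread.
sqoop import \
  --connect "jdbc:mysql://127.0.0.1:3306/test" \
  --username hue --password 1111 \
  --table testtbl \
  --target-dir /user/guest/mysqlimport \
  --driver com.mysql.jdbc.Driver
```

Note that specifying --driver makes Sqoop fall back to the generic JDBC connection manager (org.apache.sqoop.manager.GenericJdbcManager), so it logs a warning unless --connection-manager is set explicitly as well.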
    <item>
      <title>Re: Sqoop import data</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-data/m-p/105460#M68340</link>
      <description>&lt;P&gt;&lt;A href="https://community.hortonworks.com/users/2030/revlin.html"&gt;@Revlin Abbi&lt;/A&gt; - That error is caused by an incompatible version of the MySQL connector JAR file.&lt;/P&gt;&lt;P&gt;ls -l /usr/share/java/mysql*&lt;/P&gt;&lt;P&gt;If you want to work around the problem, &lt;/P&gt;&lt;P&gt;use --driver com.mysql.jdbc.Driver &lt;/P&gt;&lt;P&gt;That will resolve the error, but the recommendation is to use the correct version of the MySQL connector JAR.&lt;/P&gt;</description>
      <pubDate>Thu, 21 Jan 2016 01:54:02 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-data/m-p/105460#M68340</guid>
      <dc:creator>jkotireddy</dc:creator>
      <dc:date>2016-01-21T01:54:02Z</dc:date>
    </item>
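The JAR check described in the reply above can be run as follows (paths are typical for an HDP sandbox and may differ on other installs; the connector version shown is only a hypothetical example):

```shell
# Inspect the MySQL connector JAR(s) on the system (path from the thread).
ls -l /usr/share/java/mysql*

# Sqoop's lib directory on HDP (assumed path; adjust for your install).
ls -l /usr/hdp/current/sqoop-client/lib/mysql* 2>/dev/null

# The proper fix: install a Connector/J version that matches your MySQL
# server, e.g. (hypothetical version number):
# cp mysql-connector-java-5.1.38-bin.jar /usr/hdp/current/sqoop-client/lib/
```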
    <item>
      <title>Re: Sqoop import data</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-data/m-p/105461#M68341</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;Thank you Neeraj and jkotireddy. I have tried the sqoop import statement with the driver line, but I still get an error (pasted below). I will update the MySQL driver and try again - but is there any reason why it still doesn't work with the driver line included?&lt;/P&gt;&lt;P&gt;sqoop import \ &lt;/P&gt;&lt;P&gt;--connect "jdbc:mysql://127.0.0.1:3306/test" \ &lt;/P&gt;&lt;P&gt;--username hue --password 1111 \ &lt;/P&gt;&lt;P&gt;--table testtbl \ &lt;/P&gt;&lt;P&gt;--target-dir /user/guest/mysqlimport \ &lt;/P&gt;&lt;P&gt;--driver com.mysql.jdbc.Driver&lt;/P&gt;&lt;PRE&gt;Warning: /usr/hdp/2.3.2.0-2950/accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.                                   
16/01/21 00:29:12 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.3.2.0-2950                          
16/01/21 00:29:12 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consi
der using -P instead.                                                                                  
16/01/21 00:29:12 WARN sqoop.ConnFactory: Parameter --driver is set to an explicit driver however appro
priate connection manager is not being set (via --connection-manager). Sqoop is going to fall back to o
rg.apache.sqoop.manager.GenericJdbcManager. Please specify explicitly which connection manager should b
e used next time.                                                                                      
16/01/21 00:29:12 INFO manager.SqlManager: Using default fetchSize of 1000                             
16/01/21 00:29:12 INFO tool.CodeGenTool: Beginning code generation                                     
SLF4J: Class path contains multiple SLF4J bindings.                                                    
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.2.0-2950/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/
impl/StaticLoggerBinder.class]                                                                         
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.2.0-2950/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4
j/impl/StaticLoggerBinder.class]                                                                       
SLF4J: See &lt;A href="http://www.slf4j.org/codes.html#multiple_bindings"&gt;http://www.slf4j.org/codes.html#multiple_bindings&lt;/A&gt; for an explanation. 
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]                                   
16/01/21 00:29:13 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM testtbl AS t WHERE 
1=0                                                                                                    
16/01/21 00:29:13 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM testtbl AS t WHERE 
1=0                                                                                                    
16/01/21 00:29:13 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/2.3.2.0-2950/hadoop-mapre
duce                                                                                                   
Note: /tmp/sqoop-root/compile/5e5baec496fc20389f12c27fbc094cd5/testtbl.java uses or overrides a depreca
ted API.                                                                                               
Note: Recompile with -Xlint:deprecation for details.                                                   
16/01/21 00:29:15 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/5e5baec496fc20
389f12c27fbc094cd5/testtbl.jar                                                                         
16/01/21 00:29:15 INFO mapreduce.ImportJobBase: Beginning import of testtbl                            
16/01/21 00:29:15 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM testtbl AS t WHERE 
1=0                                                                                                    
16/01/21 00:29:16 INFO impl.TimelineClientImpl: Timeline service address: &lt;A href="http://sandbox.hortonworks.co/"&gt;http://sandbox.hortonworks.co&lt;/A&gt;
m:8188/ws/v1/timeline/                                                                                 
16/01/21 00:29:16 INFO client.RMProxy: Connecting to ResourceManager at sandbox.hortonworks.com/10.0.2.
15:8050                                                                                                
16/01/21 00:29:17 ERROR tool.ImportTool: Encountered IOException running import job: org.apache.hadoop.
mapred.FileAlreadyExistsException: Output directory hdfs://sandbox.hortonworks.com:8020/user/guest/mysq
limport already exists                                                                                 
        at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.ja
va:146)                                                                                                
        at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:266)                  
        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:139)           
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)                                       
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)                                       
        at java.security.AccessController.doPrivileged(Native Method)                                  
        at javax.security.auth.Subject.doAs(Subject.java:415)                                          
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)        
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)                                       
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1308)                            
        at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:196)                
        at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:169)                     
        at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:266)                  
        at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673)                        
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)                           
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)                                   
        at org.apache.sqoop.Sqoop.run(Sqoop.java:148)                                                  
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)                                   
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184)                                             
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226)                                              
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235)                                              
        at org.apache.sqoop.Sqoop.main(Sqoop.java:244)    
&lt;/PRE&gt;</description>
      <pubDate>Thu, 21 Jan 2016 08:36:11 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-data/m-p/105461#M68341</guid>
      <dc:creator>Rebel</dc:creator>
      <dc:date>2016-01-21T08:36:11Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop import data</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-data/m-p/105462#M68342</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/2030/revlin.html" nodeid="2030"&gt;@Revlin Abbi&lt;/A&gt; this is a different error. It says the output directory already exists; delete /user/guest/mysqlimport and try again, or re-run Sqoop with a new output directory name.&lt;/P&gt;</description>
      <pubDate>Thu, 21 Jan 2016 09:19:48 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-data/m-p/105462#M68342</guid>
      <dc:creator>aervits</dc:creator>
      <dc:date>2016-01-21T09:19:48Z</dc:date>
    </item>
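Either fix suggested above can be scripted; --delete-target-dir is documented for the Sqoop 1.4.6 import tool, which the logs show in use (connection details are the thread's, not verified):

```shell
# Option 1: remove the existing HDFS output directory, then re-run the import.
hdfs dfs -rm -r /user/guest/mysqlimport

# Option 2: have Sqoop remove the target directory automatically each run.
sqoop import \
  --connect "jdbc:mysql://127.0.0.1:3306/test" \
  --username hue --password 1111 \
  --table testtbl \
  --target-dir /user/guest/mysqlimport \
  --delete-target-dir \
  --driver com.mysql.jdbc.Driver
```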
    <item>
      <title>Re: Sqoop import data</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-data/m-p/105463#M68343</link>
      <description>&lt;P&gt;Hi Artem, I have tried with another directory and I get the following error...&lt;/P&gt;&lt;P&gt;sqoop import \ &lt;/P&gt;&lt;P&gt;--connect "jdbc:mysql://127.0.0.1:3306/test" \ &lt;/P&gt;&lt;P&gt;--username hue --password 1111 \ &lt;/P&gt;&lt;P&gt;--table testtbl \ &lt;/P&gt;&lt;P&gt;--target-dir /user/guest/mysqlimport2 \ &lt;/P&gt;&lt;P&gt;--driver com.mysql.jdbc.Driver&lt;/P&gt;&lt;PRE&gt;Warning: /usr/hdp/2.3.2.0-2950/accumulo does not exist! Accumulo imports will fail.                    
Please set $ACCUMULO_HOME to the root of your Accumulo installation.                                   
16/01/21 01:52:31 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.3.2.0-2950                          
16/01/21 01:52:31 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consi
der using -P instead.                                                                                  
16/01/21 01:52:31 WARN sqoop.ConnFactory: Parameter --driver is set to an explicit driver however appro
priate connection manager is not being set (via --connection-manager). Sqoop is going to fall back to o
rg.apache.sqoop.manager.GenericJdbcManager. Please specify explicitly which connection manager should b
e used next time.                                                                                      
16/01/21 01:52:31 INFO manager.SqlManager: Using default fetchSize of 1000                             
16/01/21 01:52:31 INFO tool.CodeGenTool: Beginning code generation                                     
SLF4J: Class path contains multiple SLF4J bindings.                                                    
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.2.0-2950/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/
impl/StaticLoggerBinder.class]                                                                         
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.2.0-2950/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4
j/impl/StaticLoggerBinder.class]                                                                       
SLF4J: See &lt;A href="http://www.slf4j.org/codes.html#multiple_bindings"&gt;http://www.slf4j.org/codes.html#multiple_bindings&lt;/A&gt; for an explanation. 
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]                                   
16/01/21 01:52:32 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM testtbl AS t WHERE 
1=0                                                                                                    
16/01/21 01:52:32 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM testtbl AS t WHERE 
1=0                                                                                                    
16/01/21 01:52:32 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/2.3.2.0-2950/hadoop-mapre
duce                                                                                                   
Note: /tmp/sqoop-root/compile/89e6906e14bcf45371dcde0a398899e1/testtbl.java uses or overrides a depreca
ted API.                                                                                               
Note: Recompile with -Xlint:deprecation for details.                                                   
16/01/21 01:52:33 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/89e6906e14bcf4
5371dcde0a398899e1/testtbl.jar                                                                         
16/01/21 01:52:33 INFO mapreduce.ImportJobBase: Beginning import of testtbl                            
16/01/21 01:52:34 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM testtbl AS t WHERE 
1=0                                                                                                    
16/01/21 01:52:35 INFO impl.TimelineClientImpl: Timeline service address: &lt;A href="http://sandbox.hortonworks.co/"&gt;http://sandbox.hortonworks.co&lt;/A&gt;
m:8188/ws/v1/timeline/                                                                                 
16/01/21 01:52:35 INFO client.RMProxy: Connecting to ResourceManager at sandbox.hortonworks.com/10.0.2.
15:8050                                                                                                
16/01/21 01:52:35 ERROR tool.ImportTool: Encountered IOException running import job: org.apache.hadoop.
security.AccessControlException: Permission denied: user=root, access=WRITE, inode="/user/root/.staging
":hdfs:hdfs:drwxr-xr-x                                                                                 
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:31
9)                                                                                                     
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:29
2)                                                                                                     
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionCheck
er.java:213)                                                                                           
        at org.apache.ranger.authorization.hadoop.RangerHdfsAuthorizer$RangerAccessControlEnforcer.chec
kPermission(RangerHdfsAuthorizer.java:300)                                                             
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionCheck
er.java:190)                                                                                           
        at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1771)   
        at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1755)   
        at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1738
)                                                                                                      
        at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:71)            
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3896)          
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:984) 
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(Client
NamenodeProtocolServerSideTranslatorPB.java:622)                                                       
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.
callBlockingMethod(ClientNamenodeProtocolProtos.java)                                                  
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.jav
a:616)                                                                                                 
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)                                         
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2137)                                
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2133)                                
        at java.security.AccessController.doPrivileged(Native Method)                                  
        at javax.security.auth.Subject.doAs(Subject.java:415)                                          
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)        
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2131)                                  

        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)                       
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.
java:45)                                                                                               
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)                             
        at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)        
        at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)        
        at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:3010)                        
        at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2978)                                
        at org.apache.hadoop.hdfs.DistributedFileSystem$21.doCall(DistributedFileSystem.java:1047)     
        at org.apache.hadoop.hdfs.DistributedFileSystem$21.doCall(DistributedFileSystem.java:1043)     
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)         
        at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1043)
        at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1036)        
        at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:133)   
        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:144)           
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)                                       
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)                                       
        at java.security.AccessController.doPrivileged(Native Method)                                  
        at javax.security.auth.Subject.doAs(Subject.java:415)                                          
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)        
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)                                       
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1308)                            
        at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:196)                
        at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:169)                     
        at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:266)                  
        at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673)                        
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)                           
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)                                   
        at org.apache.sqoop.Sqoop.run(Sqoop.java:148)                                                  
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)                                   
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184)                                             
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226)                                              
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235)                                              
        at org.apache.sqoop.Sqoop.main(Sqoop.java:244)                                                 
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=root, access=WRITE, inode="/user/root/.staging":hdfs:hdfs:drwxr-xr-x
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:292)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:213)
        at org.apache.ranger.authorization.hadoop.RangerHdfsAuthorizer$RangerAccessControlEnforcer.checkPermission(RangerHdfsAuthorizer.java:300)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
        at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1771)
        at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1755)
        at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1738)
        at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:71)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3896)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:984)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:622)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2137)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2133)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2131)

        at org.apache.hadoop.ipc.Client.call(Client.java:1427)
        at org.apache.hadoop.ipc.Client.call(Client.java:1358)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
        at com.sun.proxy.$Proxy9.mkdirs(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:558)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
        at com.sun.proxy.$Proxy10.mkdirs(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:3008)
        ... 27 more

&lt;/PRE&gt;</description>
      <pubDate>Thu, 21 Jan 2016 09:57:47 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-data/m-p/105463#M68343</guid>
      <dc:creator>Rebel</dc:creator>
      <dc:date>2016-01-21T09:57:47Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop import data</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-data/m-p/105464#M68344</link>
      <description>&lt;P&gt;Run the job as user guest, not root. &lt;A rel="user" href="https://community.cloudera.com/users/2030/revlin.html" nodeid="2030"&gt;@Revlin Abbi&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 21 Jan 2016 10:30:48 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-data/m-p/105464#M68344</guid>
      <dc:creator>aervits</dc:creator>
      <dc:date>2016-01-21T10:30:48Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop import data</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-data/m-p/105465#M68345</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/2030/revlin.html" nodeid="2030"&gt;@Revlin Abbi&lt;/A&gt;&lt;/P&gt;&lt;P&gt;The key line in your log is:&lt;/P&gt;&lt;PRE&gt;ERROR tool.ImportTool: Encountered IOException running import job: org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="/user/root/.staging":hdfs:hdfs:drwxr-xr-x&lt;/PRE&gt;&lt;P&gt;Fix:&lt;/P&gt;&lt;PRE&gt;su - hdfs
hdfs dfs -chown -R root:hdfs /user/root
exit&lt;/PRE&gt;&lt;P&gt;then run the job again.&lt;/P&gt;</description>
      <pubDate>Thu, 21 Jan 2016 12:45:14 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-data/m-p/105465#M68345</guid>
      <dc:creator>nsabharwal</dc:creator>
      <dc:date>2016-01-21T12:45:14Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop import data</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-data/m-p/105466#M68346</link>
      <description>&lt;P&gt;Hi, Just tried that:&lt;/P&gt;&lt;PRE&gt;[root@sandbox ~]# su - hdfs                                                                            
[hdfs@sandbox ~]$ hdfs dfs -chown -R root:hdfs /user/root                                              
&lt;/PRE&gt;&lt;P&gt;but I get:&lt;/P&gt;&lt;P&gt;chown: `/user/root': No such file or directory &lt;/P&gt;</description>
      <pubDate>Fri, 22 Jan 2016 20:01:46 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-data/m-p/105466#M68346</guid>
      <dc:creator>Rebel</dc:creator>
      <dc:date>2016-01-22T20:01:46Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop import data</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-data/m-p/105467#M68347</link>
      <description>&lt;P&gt;sudo -u hdfs hdfs dfs -mkdir /user/root&lt;/P&gt;&lt;P&gt;Then chown &lt;A rel="user" href="https://community.cloudera.com/users/2030/revlin.html" nodeid="2030"&gt;@Revlin Abbi&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Fri, 22 Jan 2016 20:11:56 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-data/m-p/105467#M68347</guid>
      <dc:creator>aervits</dc:creator>
      <dc:date>2016-01-22T20:11:56Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop import data</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-data/m-p/105468#M68348</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/2030/revlin.html" nodeid="2030"&gt;@Revlin Abbi&lt;/A&gt;&lt;/P&gt;&lt;P&gt;su - hdfs&lt;/P&gt;&lt;P&gt;hdfs dfs -mkdir -p /user/root&lt;/P&gt;&lt;P&gt;hdfs dfs -chown -R root:hdfs /user/root &lt;/P&gt;</description>
      <pubDate>Fri, 22 Jan 2016 20:14:08 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-data/m-p/105468#M68348</guid>
      <dc:creator>nsabharwal</dc:creator>
      <dc:date>2016-01-22T20:14:08Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop import data</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-data/m-p/105469#M68349</link>
      <description>&lt;P&gt;Perfect, thanks guys! It works now.&lt;/P&gt;</description>
      <pubDate>Fri, 22 Jan 2016 20:36:19 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-data/m-p/105469#M68349</guid>
      <dc:creator>Rebel</dc:creator>
      <dc:date>2016-01-22T20:36:19Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop import data</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-data/m-p/105470#M68350</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/2030/revlin.html" nodeid="2030"&gt;@Revlin Abbi&lt;/A&gt; are you still having issues with this? Can you accept best answer or provide your own solution?&lt;/P&gt;</description>
      <pubDate>Wed, 03 Feb 2016 03:19:05 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-data/m-p/105470#M68350</guid>
      <dc:creator>aervits</dc:creator>
      <dc:date>2016-02-03T03:19:05Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop import data</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-data/m-p/105471#M68351</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/2030/revlin.html" nodeid="2030"&gt;@Revlin Abbi&lt;/A&gt;  Accepting this as best answer &lt;/P&gt;</description>
      <pubDate>Fri, 19 Feb 2016 21:43:16 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-data/m-p/105471#M68351</guid>
      <dc:creator>nsabharwal</dc:creator>
      <dc:date>2016-02-19T21:43:16Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop import data</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-data/m-p/105472#M68352</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/393/aervits.html" nodeid="393"&gt;@Artem Ervits&lt;/A&gt;  It's related to the wrong version of the MySQL connector jar. FYI &lt;/P&gt;</description>
      <pubDate>Fri, 19 Feb 2016 21:47:26 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-data/m-p/105472#M68352</guid>
      <dc:creator>nsabharwal</dc:creator>
      <dc:date>2016-02-19T21:47:26Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop import data</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-data/m-p/380161#M244006</link>
      <description>&lt;P&gt;Thank you very much! This solution helped me a lot; I was finally able to complete my task. Blessings.&lt;/P&gt;</description>
      <pubDate>Sun, 03 Dec 2023 02:53:22 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-data/m-p/380161#M244006</guid>
      <dc:creator>IsabelBenitesRo</dc:creator>
      <dc:date>2023-12-03T02:53:22Z</dc:date>
    </item>
  </channel>
</rss>