Archives of Support Questions (Read Only)

This is an archived board for historical reference. Information and links may no longer be available or relevant. To ask a new question, please post a new topic on the appropriate active board.

Sqoop import data

Visitor

Hi,

Hoping someone can advise. I am playing around with Sqoop. I can list the databases using:

sqoop list-databases --connect jdbc:mysql://127.0.0.1:3306 --username hue --password 1111

And I can list the tables:

sqoop list-tables --connect "jdbc:mysql://127.0.0.1:3306/test" --username hue --password 1111

However, when I try an import, I get an error:

sqoop import \
--connect "jdbc:mysql://127.0.0.1:3306/test" \
--username hue --password 1111 \
--table testtbl \
--target-dir /user/guest/mysqlimport

The error is below. I am not sure why this code is causing an error. Does anyone have any ideas?

Regards

Rev

Warning: /usr/hdp/2.3.2.0-2950/accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
16/01/20 17:38:31 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.3.2.0-2950
16/01/20 17:38:31 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
16/01/20 17:38:31 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
16/01/20 17:38:31 INFO tool.CodeGenTool: Beginning code generation
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.2.0-2950/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.2.0-2950/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
16/01/20 17:38:31 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `testtbl` AS t LIMIT 1
16/01/20 17:38:31 ERROR manager.SqlManager: Error reading from database: java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@4b1aa70c is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@4b1aa70c is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
        at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:934)
        at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:931)
        at com.mysql.jdbc.MysqlIO.checkForOutstandingStreamingData(MysqlIO.java:2735)
        at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1899)
        at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2151)
        at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2619)
        at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2569)
        at com.mysql.jdbc.StatementImpl.executeQuery(StatementImpl.java:1524)
        at com.mysql.jdbc.ConnectionImpl.getMaxBytesPerChar(ConnectionImpl.java:3003)
        at com.mysql.jdbc.Field.getMaxBytesPerCharacter(Field.java:602)
        at com.mysql.jdbc.ResultSetMetaData.getPrecision(ResultSetMetaData.java:445)
        at org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java:286)
        at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:241)
        at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:227)
        at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:295)
        at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1845)
        at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1645)
        at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:478)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:148)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:244)
16/01/20 17:38:31 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: No columns to generate for ClassWriter
        at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1651)
        at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:478)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:148)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:244)

1 ACCEPTED SOLUTION

Rising Star

@Revlin Abbi - That error is caused by the wrong version of the MySQL connector JAR file. Check which connector is installed:

ls -l /usr/share/java/mysql*

If you want to work around the problem, add the following to your command:

--driver com.mysql.jdbc.Driver

That will get you past the error, but the recommendation is to use the right version of the MySQL connector JAR.
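As a minimal sketch, checking and replacing the connector might look like the following; the Sqoop lib path and the connector version shown are assumptions for an HDP-style install, so adjust them to your environment:

# Inspect the connector JAR(s) the system and Sqoop are using (paths assumed for HDP)
ls -l /usr/share/java/mysql-connector-java*.jar
ls -l /usr/hdp/current/sqoop-client/lib/

# Hypothetical example: drop a newer connector where Sqoop will pick it up
cp mysql-connector-java-5.1.38-bin.jar /usr/hdp/current/sqoop-client/lib/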

View solution in original post

16 REPLIES

Master Mentor

@Revlin Abbi

That error is caused by the wrong version of the MySQL connector JAR file. Check which connector is installed:

ls -l /usr/share/java/mysql*

**update**

add this to your syntax:

--driver com.mysql.jdbc.Driver

Master Mentor

@Revlin Abbi

try this:

sqoop import --connect jdbc:mysql://127.0.0.1:3306/test --username root --password root --table t1 --driver com.mysql.jdbc.Driver

sqoop import \
--connect "jdbc:mysql://127.0.0.1:3306/test" \
--username hue --password 1111 \
--table testtbl \
--target-dir /user/guest/mysqlimport \
--driver com.mysql.jdbc.Driver


Thank you very much! This solution helped me a lot, and I was able to complete my task. Blessings

Rising Star

@Revlin Abbi - That error is caused by the wrong version of the MySQL connector JAR file. Check which connector is installed:

ls -l /usr/share/java/mysql*

If you want to work around the problem, add the following to your command:

--driver com.mysql.jdbc.Driver

That will get you past the error, but the recommendation is to use the right version of the MySQL connector JAR.

Master Mentor

@Revlin Abbi Accepting this as best answer

Visitor

Hi,

Thank you Neeraj and jkotireddy. I have tried the sqoop import statement with the driver line, but I still get an error (pasted below). I will update the MySQL driver and try again - but is there any reason why it still doesn't work with the driver line included?

sqoop import \
--connect "jdbc:mysql://127.0.0.1:3306/test" \
--username hue --password 1111 \
--table testtbl \
--target-dir /user/guest/mysqlimport \
--driver com.mysql.jdbc.Driver

Warning: /usr/hdp/2.3.2.0-2950/accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
16/01/21 00:29:12 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.3.2.0-2950
16/01/21 00:29:12 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
16/01/21 00:29:12 WARN sqoop.ConnFactory: Parameter --driver is set to an explicit driver however appropriate connection manager is not being set (via --connection-manager). Sqoop is going to fall back to org.apache.sqoop.manager.GenericJdbcManager. Please specify explicitly which connection manager should be used next time.
16/01/21 00:29:12 INFO manager.SqlManager: Using default fetchSize of 1000
16/01/21 00:29:12 INFO tool.CodeGenTool: Beginning code generation
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.2.0-2950/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.2.0-2950/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
16/01/21 00:29:13 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM testtbl AS t WHERE 1=0
16/01/21 00:29:13 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM testtbl AS t WHERE 1=0
16/01/21 00:29:13 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/2.3.2.0-2950/hadoop-mapreduce
Note: /tmp/sqoop-root/compile/5e5baec496fc20389f12c27fbc094cd5/testtbl.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
16/01/21 00:29:15 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/5e5baec496fc20389f12c27fbc094cd5/testtbl.jar
16/01/21 00:29:15 INFO mapreduce.ImportJobBase: Beginning import of testtbl
16/01/21 00:29:15 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM testtbl AS t WHERE 1=0
16/01/21 00:29:16 INFO impl.TimelineClientImpl: Timeline service address: http://sandbox.hortonworks.com:8188/ws/v1/timeline/
16/01/21 00:29:16 INFO client.RMProxy: Connecting to ResourceManager at sandbox.hortonworks.com/10.0.2.15:8050
16/01/21 00:29:17 ERROR tool.ImportTool: Encountered IOException running import job: org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory hdfs://sandbox.hortonworks.com:8020/user/guest/mysqlimport already exists
        at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:146)
        at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:266)
        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:139)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1308)
        at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:196)
        at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:169)
        at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:266)
        at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:148)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:244)

Master Mentor

@Revlin Abbi it is a different error. It says the output directory already exists - delete /user/guest/mysqlimport and try again, or run your Sqoop import with a new output directory name.
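For example (a minimal sketch; hdfs dfs -rm -r is the standard removal command, and recent Sqoop 1.4.x releases also provide a --delete-target-dir import option that clears the directory automatically):

# Remove the previous import output before re-running
hdfs dfs -rm -r /user/guest/mysqlimport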

Master Mentor

@Artem Ervits It's related to the wrong version of the MySQL connector - with --driver specified, Sqoop falls back to the GenericJdbcManager (per the warning in the log), which sidesteps the MySQL streaming result set issue. FYI

Visitor

Hi Artem, I have tried with another directory and I get the following error...

sqoop import \
--connect "jdbc:mysql://127.0.0.1:3306/test" \
--username hue --password 1111 \
--table testtbl \
--target-dir /user/guest/mysqlimport2 \
--driver com.mysql.jdbc.Driver

Warning: /usr/hdp/2.3.2.0-2950/accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
16/01/21 01:52:31 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.3.2.0-2950
16/01/21 01:52:31 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
16/01/21 01:52:31 WARN sqoop.ConnFactory: Parameter --driver is set to an explicit driver however appropriate connection manager is not being set (via --connection-manager). Sqoop is going to fall back to org.apache.sqoop.manager.GenericJdbcManager. Please specify explicitly which connection manager should be used next time.
16/01/21 01:52:31 INFO manager.SqlManager: Using default fetchSize of 1000
16/01/21 01:52:31 INFO tool.CodeGenTool: Beginning code generation
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.2.0-2950/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.2.0-2950/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
16/01/21 01:52:32 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM testtbl AS t WHERE 1=0
16/01/21 01:52:32 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM testtbl AS t WHERE 1=0
16/01/21 01:52:32 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/2.3.2.0-2950/hadoop-mapreduce
Note: /tmp/sqoop-root/compile/89e6906e14bcf45371dcde0a398899e1/testtbl.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
16/01/21 01:52:33 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/89e6906e14bcf45371dcde0a398899e1/testtbl.jar
16/01/21 01:52:33 INFO mapreduce.ImportJobBase: Beginning import of testtbl
16/01/21 01:52:34 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM testtbl AS t WHERE 1=0
16/01/21 01:52:35 INFO impl.TimelineClientImpl: Timeline service address: http://sandbox.hortonworks.com:8188/ws/v1/timeline/
16/01/21 01:52:35 INFO client.RMProxy: Connecting to ResourceManager at sandbox.hortonworks.com/10.0.2.15:8050
16/01/21 01:52:35 ERROR tool.ImportTool: Encountered IOException running import job: org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="/user/root/.staging":hdfs:hdfs:drwxr-xr-x
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:292)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:213)
        at org.apache.ranger.authorization.hadoop.RangerHdfsAuthorizer$RangerAccessControlEnforcer.checkPermission(RangerHdfsAuthorizer.java:300)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
        at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1771)
        at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1755)
        at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1738)
        at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:71)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3896)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:984)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:622)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2137)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2133)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2131)

        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
        at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
        at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:3010)
        at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2978)
        at org.apache.hadoop.hdfs.DistributedFileSystem$21.doCall(DistributedFileSystem.java:1047)
        at org.apache.hadoop.hdfs.DistributedFileSystem$21.doCall(DistributedFileSystem.java:1043)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1043)
        at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1036)
        at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:133)
        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:144)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1308)
        at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:196)
        at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:169)
        at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:266)
        at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:148)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:244)
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=root, access=WRITE, inode="/user/root/.staging":hdfs:hdfs:drwxr-xr-x
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:292)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:213)
        at org.apache.ranger.authorization.hadoop.RangerHdfsAuthorizer$RangerAccessControlEnforcer.checkPermission(RangerHdfsAuthorizer.java:300)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
        at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1771)
        at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1755)
        at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1738)
        at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:71)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3896)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:984)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:622)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2137)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2133)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2131)

        at org.apache.hadoop.ipc.Client.call(Client.java:1427)
        at org.apache.hadoop.ipc.Client.call(Client.java:1358)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
        at com.sun.proxy.$Proxy9.mkdirs(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:558)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
        at com.sun.proxy.$Proxy10.mkdirs(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:3008)
        ... 27 more
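The permission error above means the root user cannot write to its HDFS staging directory (/user/root/.staging, owned by hdfs:hdfs). A minimal sketch of the usual fix on a sandbox, assuming you can run commands as the hdfs superuser (adjust user and group to your setup):

# Create root's HDFS home directory and give root ownership (run as the hdfs superuser)
sudo -u hdfs hdfs dfs -mkdir -p /user/root
sudo -u hdfs hdfs dfs -chown root:root /user/root

Alternatively, run the sqoop import as a user that already owns its HDFS home directory.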