Sqoop import error

Contributor

sqoop import --connect jdbc:mysql:///gaian-lap386.com/ambari1 --username ambari1 --table emp --m 1
/usr/hdp/3.1.0.0-78/hadoop/libexec/hadoop-functions.sh: line 2363: HADOOP_ORG.APACHE.SQOOP.SQOOP_USER: bad substitution
/usr/hdp/3.1.0.0-78/hadoop/libexec/hadoop-functions.sh: line 2458: HADOOP_ORG.APACHE.SQOOP.SQOOP_OPTS: bad substitution
19/08/28 17:20:25 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7
19/08/28 17:20:25 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
19/08/28 17:20:25 INFO tool.CodeGenTool: Beginning code generation
Loading class `com.mysql.jdbc.Driver'. This is deprecated. The new driver class is `com.mysql.cj.jdbc.Driver'. The driver is automatically registered via the SPI and manual loading of the driver class is generally unnecessary.
19/08/28 17:20:26 ERROR manager.SqlManager: Error executing statement: java.sql.SQLException: Access denied for user 'ambari1'@'localhost' (using password: NO)
java.sql.SQLException: Access denied for user 'ambari1'@'localhost' (using password: NO)
at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:129)
at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:97)
at com.mysql.cj.jdbc.exceptions.SQLExceptionsMapping.translateException(SQLExceptionsMapping.java:122)
at com.mysql.cj.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:827)
at com.mysql.cj.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:447)
at com.mysql.cj.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:237)
at com.mysql.cj.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:199)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:247)
at org.apache.sqoop.manager.SqlManager.makeConnection(SqlManager.java:904)
at org.apache.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:59)
at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:763)
at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:786)
at org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java:289)
at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:260)
at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:246)
at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:327)
at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1872)
at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1671)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:106)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:501)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
19/08/28 17:20:26 ERROR tool.ImportTool: Import failed: java.io.IOException: No columns to generate for ClassWriter
at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1677)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:106)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:501)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)

5 REPLIES

Master Mentor

@Manoj690 

 

As I mentioned in the other thread you opened for a similar error:

https://community.cloudera.com/t5/Support-Questions/Sqoop-jdbc-error/m-p/269108#M206636

Your latest error is:

ERROR manager.SqlManager: Error executing statement: java.sql.SQLException: Access denied for user 'ambari1'@'localhost' (using password: NO)

This means you will need to check the following two things:

1. Make sure you are entering the correct password for the "ambari1" user in the sqoop command, using the "--password" option (or "-P" to be prompted for it).

2. Also make sure to add a GRANT like the following for the 'ambari1' user in the MySQL DB, so that it can connect to MySQL from the host where you are running the sqoop command.

Example:

mysql> GRANT ALL PRIVILEGES ON *.* TO  'ambari1'@'localhost' IDENTIFIED BY 'XXXXXXXXXX' WITH GRANT OPTION;
mysql> FLUSH PRIVILEGES;
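
Note: if the database server is MySQL 8 or later, the GRANT ... IDENTIFIED BY syntax above is no longer accepted; in that case create the user first and then grant, for example:

mysql> CREATE USER IF NOT EXISTS 'ambari1'@'localhost' IDENTIFIED BY 'XXXXXXXXXX';
mysql> GRANT ALL PRIVILEGES ON *.* TO 'ambari1'@'localhost' WITH GRANT OPTION;
mysql> FLUSH PRIVILEGES;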

 

Please replace "XXXXXXXXXX" in the above command with the actual ambari1 password.
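
Assuming the grant is in place, a corrected sqoop command would look something like the one below. Note the connect string form jdbc:mysql://host/database with two slashes; your original command used jdbc:mysql:/// (three slashes), which the driver parses as an empty hostname, i.e. localhost. Using -P instead of --password makes sqoop prompt for the password so it does not end up in your shell history:

sqoop import --connect jdbc:mysql://gaian-lap386.com/ambari1 --username ambari1 -P --table emp -m 1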

 

Please also share the output of the following commands; they should show an entry for the 'ambari1' user with host 'localhost':

 

# mysql -u root -p
Enter Password: <YOUR_PASSWORD>
mysql> use mysql;
mysql> SELECT user, host FROM user;
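
For reference, the output should contain a row like the following (illustrative; your user table will also list other accounts):

+---------+-----------+
| user    | host      |
+---------+-----------+
| ambari1 | localhost |
+---------+-----------+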

 

 

Contributor

sqoop import --connect jdbc:mysql://gaian-lap386.com/ambari1 --username ambari1 --password XXXXXXXXX --table emp -m 1
/usr/hdp/3.1.0.0-78/hadoop/libexec/hadoop-functions.sh: line 2363: HADOOP_ORG.APACHE.SQOOP.SQOOP_USER: bad substitution
/usr/hdp/3.1.0.0-78/hadoop/libexec/hadoop-functions.sh: line 2458: HADOOP_ORG.APACHE.SQOOP.SQOOP_OPTS: bad substitution
19/08/28 18:06:37 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7
19/08/28 18:06:37 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
19/08/28 18:06:37 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
19/08/28 18:06:37 INFO tool.CodeGenTool: Beginning code generation
Loading class `com.mysql.jdbc.Driver'. This is deprecated. The new driver class is `com.mysql.cj.jdbc.Driver'. The driver is automatically registered via the SPI and manual loading of the driver class is generally unnecessary.
19/08/28 18:06:37 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `emp` AS t LIMIT 1
19/08/28 18:06:37 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `emp` AS t LIMIT 1
19/08/28 18:06:37 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/3.1.0.0-78/hadoop-mapreduce
Note: /tmp/sqoop-gaian/compile/a97d3d83e0f84d41242ebc74363df379/emp.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
19/08/28 18:06:39 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-gaian/compile/a97d3d83e0f84d41242ebc74363df379/emp.jar
19/08/28 18:06:39 WARN manager.MySQLManager: It looks like you are importing from mysql.
19/08/28 18:06:39 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
19/08/28 18:06:39 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
19/08/28 18:06:39 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
19/08/28 18:06:39 INFO mapreduce.ImportJobBase: Beginning import of emp
19/08/28 18:06:40 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/08/28 18:06:41 WARN shortcircuit.DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
19/08/28 18:06:42 INFO client.RMProxy: Connecting to ResourceManager at gaian-lap386.com/192.168.24.32:8050
19/08/28 18:06:42 INFO client.AHSProxy: Connecting to Application History server at gaian-lap386.com/192.168.24.32:10200
19/08/28 18:06:43 ERROR tool.ImportTool: Import failed: org.apache.hadoop.security.AccessControlException: Permission denied: user=gaian, access=WRITE, inode="/user":hdfs:hdfs:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:399)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:255)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:193)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1857)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1841)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1800)
at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:59)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3150)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:1126)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:707)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:524)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1025)
at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:876)
at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:822)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2682)

at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:121)
at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:88)
at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2417)
at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2391)
at org.apache.hadoop.hdfs.DistributedFileSystem$27.doCall(DistributedFileSystem.java:1325)
at org.apache.hadoop.hdfs.DistributedFileSystem$27.doCall(DistributedFileSystem.java:1322)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1339)
at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1314)
at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:162)
at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:113)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:148)
at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1570)
at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1567)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1567)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1588)
at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:200)
at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:173)
at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:270)
at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:692)
at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:127)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:520)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=gaian, access=WRITE, inode="/user":hdfs:hdfs:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:399)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:255)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:193)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1857)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1841)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1800)
at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:59)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3150)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:1126)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:707)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:524)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1025)
at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:876)
at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:822)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2682)

at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1497)
at org.apache.hadoop.ipc.Client.call(Client.java:1443)
at org.apache.hadoop.ipc.Client.call(Client.java:1353)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:228)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)
at com.sun.proxy.$Proxy9.mkdirs(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:653)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)
at com.sun.proxy.$Proxy10.mkdirs(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2415)
... 29 more

Master Mentor

@Manoj690 

 

Looks like your previous error is resolved. For the new error, it would have been better to open a new thread, to avoid confusion for other readers of this thread.

 

The new error is HDFS related (and completely unrelated to the MySQL error you originally reported in this thread):

 

19/08/28 18:06:43 ERROR tool.ImportTool: Import failed: org.apache.hadoop.security.AccessControlException: Permission denied: user=gaian, access=WRITE, inode="/user":hdfs:hdfs:drwxr-xr-x

 

 

This HDFS error means that the user "gaian" who is running the sqoop job does not have write permission on the "/user" directory: job submission tries to create a staging directory under /user/gaian, which does not exist, and "gaian" is not allowed to create it.

You might be able to fix it by creating the user's HDFS home directory as the "hdfs" superuser:

 

 

# su - hdfs
# hdfs dfs -mkdir /user/gaian
# hdfs dfs -chown -R gaian:hadoop /user/gaian
# hdfs dfs -chmod -R 755 /user/gaian
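
Afterwards you can verify the ownership and permissions of the new directory, then re-run the sqoop import as the "gaian" user:

# hdfs dfs -ls /user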

 


If your question is answered, please mark the answer as the accepted solution.
If you find a reply useful, say thanks by clicking the thumbs-up button.

Contributor

I forgot the password for "su - hdfs".

Master Mentor

@Manoj690 

 

Try this: first switch to the "root" user using "su -", then from the root account run the "su - hdfs" command.

# su - 
# su - hdfs
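
Note: the root user can switch to any local account without being prompted for a password, so you do not need to know (or reset) the hdfs user's password for this to work.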