sqoop import --connect jdbc:mysql://hostname/ambari1 --username ambari1 --password xxxx --table emp -m 1
/usr/hdp/3.1.0.0-78/hadoop/libexec/hadoop-functions.sh: line 2363: HADOOP_ORG.APACHE.SQOOP.SQOOP_USER: bad substitution
/usr/hdp/3.1.0.0-78/hadoop/libexec/hadoop-functions.sh: line 2458: HADOOP_ORG.APACHE.SQOOP.SQOOP_OPTS: bad substitution
19/09/04 12:57:17 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7
19/09/04 12:57:17 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
19/09/04 12:57:17 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
19/09/04 12:57:17 INFO tool.CodeGenTool: Beginning code generation
Loading class `com.mysql.jdbc.Driver'. This is deprecated. The new driver class is `com.mysql.cj.jdbc.Driver'. The driver is automatically registered via the SPI and manual loading of the driver class is generally unnecessary.
19/09/04 12:57:18 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `emp` AS t LIMIT 1
19/09/04 12:57:18 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `emp` AS t LIMIT 1
19/09/04 12:57:18 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/3.1.0.0-78/hadoop-mapreduce
Note: /tmp/sqoop-gaian/compile/3ca3eff02e26141a16b357355f58c5e3/emp.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
19/09/04 12:57:20 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-name/compile/3ca3eff02e26141a16b357355f58c5e3/emp.jar
19/09/04 12:57:21 WARN manager.MySQLManager: It looks like you are importing from mysql.
19/09/04 12:57:21 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
19/09/04 12:57:21 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
19/09/04 12:57:21 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
19/09/04 12:57:21 INFO mapreduce.ImportJobBase: Beginning import of emp
19/09/04 12:57:21 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/09/04 12:57:23 WARN shortcircuit.DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
19/09/04 12:57:23 INFO client.RMProxy: Connecting to ResourceManager at hostname/192.168.24.32:8050
19/09/04 12:57:24 INFO client.AHSProxy: Connecting to Application History server at hostname/192.168.24.32:10200
19/09/04 12:57:24 ERROR tool.ImportTool: Import failed: org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory hdfs://gaian-lap386.com:8020/user/gaian/emp already exists
    at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:164)
    at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:277)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:143)
    at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1570)
    at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1567)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1567)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1588)
    at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:200)
    at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:173)
    at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:270)
    at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:692)
    at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:127)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:520)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
We see that the import fails because of the following error:
ERROR tool.ImportTool: Import failed: org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory hdfs://gaian-lap386.com:8020/user/gaian/emp already exists
The output directory "hdfs://gaian-lap386.com:8020/user/gaian/emp" already exists in HDFS, which is why the import fails.
You can easily verify this as follows:
# su - hdfs -c "hdfs dfs -ls /user/gaian/emp"
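For a scripted check, the standard "hdfs dfs -test -d" flag returns exit status 0 when the directory exists (a minimal sketch; adjust the path and user to your environment):
# su - hdfs -c "hdfs dfs -test -d /user/gaian/emp" && echo "Output directory already exists"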
So either specify a different table / target directory, OR add the following option to your Sqoop command:
--delete-target-dir     Delete the import target directory if it exists
Please refer to the Sqoop documentation to understand what this option actually does and whether it is feasible for your use case.
OR you can instead define a different "--target-dir" (another HDFS destination directory) on the command line.
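For example (a sketch; "/user/gaian/emp_new" is a hypothetical destination path, any HDFS path that does not already exist will do):
sqoop import --connect jdbc:mysql://hostname/ambari1 --username ambari1 -P --table emp --target-dir /user/gaian/emp_new -m 1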