Created on 01-31-2015 08:29 AM - edited 09-16-2022 02:20 AM
Hello,
We have installed CDH 5.3 on 20 nodes, which comes with both Sqoop 1 and Sqoop 2.
I am trying to run a Sqoop 1 job through the command-line shell, but it fails with an exception, and the error changes depending on which jar I use.
DB: PostgreSQL
I have placed the PostgreSQL JDBC jar inside /var/lib/sqoop/.
I have tried granting all the required permissions on the Sqoop directory, but I am still unable to run the job.
It complains that no DB driver jar has been selected or can be read.
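For reference, the jar was copied into /var/lib/sqoop/ roughly like this (the jar filename is only an example; the exact name depends on the driver version downloaded):
# copy the PostgreSQL JDBC driver jar into Sqoop 1's lib directory
sudo cp postgresql-9.3-1102.jdbc4.jar /var/lib/sqoop/
# make sure the jar is readable by the user running the sqoop command
sudo chmod 644 /var/lib/sqoop/postgresql-9.3-1102.jdbc4.jar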
When I try to use Sqoop 2 through Hue, it does not let me create a Sqoop job: when I click to create a new job, it never moves to the next page.
Created 02-09-2015 09:53 PM
Hello abe,
Thanks for helping me. The issue resolved itself and I do not know exactly what happened; it looks like some special characters had been auto-generated as part of the directory name, which is why it kept saying "no such file or directory". As soon as I deleted the directory and created a new one, everything worked fine.
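In case anyone else hits the same symptom: one quick way to reveal hidden or auto-generated characters in an HDFS directory name is to pipe the parent listing through cat -A on Linux, which marks line ends and escapes non-printing bytes. The path below is just the parent of my earlier target directory:
# list the parent directory and expose any non-printing characters in the entry names
hadoop fs -ls /user/njaiswal/cidb | cat -A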
Created 01-31-2015 08:53 AM
The following is the error I am getting:
sqoop import --connect jdbc:postgresql://10.4.2.68:6453/snap318 --username ae -P --table lead_dmn --target-dir /user/njaiswal/cidb/prime
Warning: /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/bin/../lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
15/01/31 16:49:55 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5-cdh5.3.0
Enter password:
15/01/31 16:49:59 INFO manager.SqlManager: Using default fetchSize of 1000
15/01/31 16:49:59 INFO tool.CodeGenTool: Beginning code generation
15/01/31 16:49:59 ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.RuntimeException: Could not load db driver class: org.postgresql.Driver
java.lang.RuntimeException: Could not load db driver class: org.postgresql.Driver
at org.apache.sqoop.manager.SqlManager.makeConnection(SqlManager.java:848)
at org.apache.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:52)
at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:736)
at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:759)
at org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java:269)
at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:240)
at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:226)
at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:295)
at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1833)
at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1645)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:96)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:478)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
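Two checks I can try on this node to narrow it down (the jar path below is only an example of where the driver might sit):
# confirm the jar on this node actually contains org.postgresql.Driver
jar tf /var/lib/sqoop/postgresql-9.3-1102.jdbc4.jar | grep -i driver
# pass the driver class explicitly; Sqoop then falls back to its generic JDBC manager, so this is mainly a diagnostic step
sqoop import --connect jdbc:postgresql://10.4.2.68:6453/snap318 --driver org.postgresql.Driver --username ae -P --table lead_dmn --target-dir /user/njaiswal/cidb/prime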
Created 01-31-2015 09:24 AM
If I try the same command from a different node, the error differs:
sqoop import --connect jdbc:postgresql://10.4.2.68:6453/snap318 --username ae -P --table lead_dmn --target-dir /user/njaiswal/cidb/prime
Warning: /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/bin/../lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
15/01/31 17:22:18 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5-cdh5.3.0
Enter password:
15/01/31 17:22:22 INFO manager.SqlManager: Using default fetchSize of 1000
15/01/31 17:22:22 INFO tool.CodeGenTool: Beginning code generation
15/01/31 17:22:22 ERROR manager.SqlManager: Error executing statement: org.postgresql.util.PSQLException: ERROR: Encountered "AS" at line 1, column 29.
Was expecting one of:
<EOF>
";" ...
org.postgresql.util.PSQLException: ERROR: Encountered "AS" at line 1, column 29.
Was expecting one of:
<EOF>
";" ...
at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2102)
at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:1835)
at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:257)
at org.postgresql.jdbc2.AbstractJdbc2Statement.execute(AbstractJdbc2Statement.java:500)
at org.postgresql.jdbc2.AbstractJdbc2Statement.executeWithFlags(AbstractJdbc2Statement.java:374)
at org.postgresql.jdbc2.AbstractJdbc2Connection.execSQLUpdate(AbstractJdbc2Connection.java:263)
at org.postgresql.jdbc2.AbstractJdbc2Connection.setTransactionIsolation(AbstractJdbc2Connection.java:829)
at org.apache.sqoop.manager.SqlManager.makeConnection(SqlManager.java:883)
at org.apache.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:52)
at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:736)
at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:759)
at org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java:269)
at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:240)
at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:226)
at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:295)
at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1833)
at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1645)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:96)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:478)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
15/01/31 17:22:22 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: No columns to generate for ClassWriter
at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1651)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:96)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:478)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
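One way I can isolate whether this parse error comes from the connection itself rather than from the import is to run a trivial query over the same connect string with sqoop eval:
# run a trivial query over the same JDBC connection; if the same error appears here, the problem is in the connection/driver combination on this node, not in the import
sqoop eval --connect jdbc:postgresql://10.4.2.68:6453/snap318 --username ae -P --query "SELECT 1"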
Created 02-04-2015 02:07 PM
Sqoop 2 is going through a lot of changes and isn't supported in Hue in CDH 5.3.0. It should be in CDH 5.3.2, AFAIK.
The Sqoop 1 command you are running has two different problems, corresponding to the two errors shown above: the driver class that cannot be loaded on the first node, and the parse error during connection setup on the second.
Created 02-04-2015 02:53 PM
Hi abe,
I am running into a new issue on HDFS.
Sqoop runs fine and creates an output directory at the given --target-dir path /xxx, but although I can see the directory on HDFS, I cannot access it: when I run hadoop fs -ls /xxx, it says no such file or directory. Please help me resolve this issue.
Created 02-04-2015 03:03 PM
Also, when I tried to access it through Hue, it says the following:
You are a Hue admin but not a HDFS superuser (which is "hdfs").
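I guess one thing I can try is to look at the directory as the hdfs superuser and, if it turns out to be an ownership problem, hand it back to my own user (the paths below follow my earlier --target-dir, and I am assuming my HDFS username matches my home directory name):
# list the directory entry itself as the HDFS superuser
sudo -u hdfs hadoop fs -ls -d /user/njaiswal/cidb/prime
# only if ownership really is the problem: reassign the directory to my user
sudo -u hdfs hadoop fs -chown -R njaiswal /user/njaiswal/cidb/prime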
Created 02-04-2015 03:38 PM
Created 02-04-2015 05:07 PM
Yes, the job ran successfully, but I am still unable to view the file because the directory that contains it does not open.
Sqoop ran successfully and retrieved 4 records, and it also created the directory by itself when I specified it explicitly from the command line.
It is not letting me look into the directory: it says "no such file or directory", but I can see that the directory is present. Could this be something to do with HDFS permissions?
Created 02-04-2015 05:27 PM
From the CLI, Hadoop normally shows: "ls: Permission denied: user=admin..." if you don't have permission. What is your --target-dir set to?
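A couple of quick checks usually tell these two cases apart (using your earlier --target-dir as the example path; adjust it to whatever you actually passed):
# list the directory entry itself, not its contents
hadoop fs -ls -d /user/njaiswal/cidb/prime
# list the parent and look closely at how the entry name is actually spelled; quoting the path also guards against stray shell-special characters
hadoop fs -ls '/user/njaiswal/cidb'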
Created 02-04-2015 05:33 PM
You are right, it should say permission denied, but strangely it says "No file or directory found", even though we can see that the directory is present.