
Sqoop Error, Sqoop 2 not working through hue

Expert Contributor

Hello,

 

We have installed CDH 5.3 on 20 nodes, which ships with both Sqoop 1 and Sqoop 2.

 

I am trying to run a Sqoop 1 job through the command-line interactive shell, but it fails with an exception. When I swap in a different jar, the error changes.

 

DB: PostgreSQL

 

I have placed the PostgreSQL JDBC driver jar inside /var/lib/sqoop/.

 

I tried granting all the required permissions on the Sqoop directories, but I am still unable to run it.

 

The error says no DB jar was selected or read.
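For what it's worth, a minimal sketch of the driver setup Sqoop 1 expects; the jar version here is an assumption, and a temporary directory stands in for the real Sqoop lib dir (e.g. /var/lib/sqoop on CDH):

```shell
# Minimal sketch; the jar version is an assumption, and /tmp/demo-sqoop-lib
# stands in for the real Sqoop lib directory (e.g. /var/lib/sqoop on CDH).
SQOOP_LIB=/tmp/demo-sqoop-lib
mkdir -p "$SQOOP_LIB"

# Placeholder standing in for the real PostgreSQL JDBC driver jar.
touch postgresql-9.3-1102.jdbc4.jar

# The driver jar must sit in the lib dir and be readable by the user
# running sqoop, otherwise Sqoop reports that no DB jar could be loaded.
cp postgresql-9.3-1102.jdbc4.jar "$SQOOP_LIB/"
chmod 644 "$SQOOP_LIB"/postgresql-*.jar
ls -l "$SQOOP_LIB"
```

If the jar is present and readable but the error persists, it is worth checking which sqoop binary is actually on the PATH, since CDH installs both Sqoop 1 and Sqoop 2.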

 

When I try to use Sqoop 2 through Hue, it does not allow me to create a Sqoop job: when I click "create new job", it does not go to the next page.

1 ACCEPTED SOLUTION

Expert Contributor

Hello abe,

 

Thanks for helping me. The issue has resolved itself; I do not know what happened. I suspect some kind of auto-generated special characters were prepended to the directory name, which is why it said "no such file or directory". As soon as I deleted the directory and created a new one, it worked fine.


15 REPLIES

Expert Contributor

Hey abe,

 

I am running the sqoop command on node 1; the cluster comprises 20 nodes (1 NameNode, 19 DataNodes).

 

I can access HDFS from all the nodes, and I can also access directories that were created with hadoop fs -mkdir /user/xyz/abc. The only issue is that I am unable to access the directory created by the sqoop import, even though the sqoop job was successful.

Expert Contributor

Hello abe, 

 

following is the sqoop command:

 

sqoop import --connect jdbc:mysql://ip-address/db --username abc --password cde --query "select * from hfpoverty WHERE hindus_females > 10 AND \$CONDITIONS" --target-dir /user/jais/test3 -m 1
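One thing worth noting about the command above: inside double quotes the shell would expand $CONDITIONS itself, so it must be escaped as \$CONDITIONS to reach Sqoop literally. A quick local check of the quoting (no cluster needed):

```shell
# Inside double quotes, $CONDITIONS must be escaped so the shell passes the
# literal token through; Sqoop itself replaces it with per-split conditions.
QUERY="select * from hfpoverty WHERE hindus_females > 10 AND \$CONDITIONS"
echo "$QUERY"

# Unescaped, the shell substitutes an (unset, hence empty) variable instead,
# and Sqoop never sees the token it needs:
BROKEN="select * from hfpoverty WHERE hindus_females > 10 AND $CONDITIONS"
echo "$BROKEN"
```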

Expert Contributor

So, the test3 directory has been created, but I am unable to access it: it says no such file or directory.
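An aside on debugging this: invisible characters in a name can make a directory that a listing appears to show unfindable by its literal name. Locally (and the same idea applies to `hadoop fs -ls` output), non-printing bytes can be revealed with cat -v or od -c; the directory and the stray carriage return here are a made-up example:

```shell
# Simulate a directory whose name carries an invisible leading character
# (here a carriage return): the listing looks normal, but the literal
# name "test3" does not exist.
mkdir -p /tmp/demo-hdfs
(cd /tmp/demo-hdfs && mkdir -p "$(printf '\rtest3')")

# cat -v makes the hidden byte visible as ^M.
ls /tmp/demo-hdfs | cat -v
```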


New Contributor

I have a somewhat similar issue, but my error is a little different.

 

Command line -

sudo -u hdfs sqoop import \
  --connect jdbc:oracle:thin:@//BIDEVDC01.gmpvt.net:1521/BIDEV \
  --table CM_GIS_OUTAGE_EXTRACT \
  --fields-terminated-by '\t' \
  --username DWADM --password dwadm \
  --target-dir /user/etl/sqoop/init/CM_GIS_OUTAGE_EXTRACT \
  --verbose \
  --connection-manager org.apache.sqoop.manager.OracleManager \
  --num-mappers 4 \
  --mapreduce-job-name CM_GIS_OUTAGE_EXTRACT \
  --direct --split-by DEVICE_OID \
  --hive-import &> sqoop-CM_GIS_OUTAGE_EXTRACT.log

 

ERROR -

15/02/18 17:16:14 INFO mapreduce.Job: map 0% reduce 0%
15/02/18 17:16:14 INFO mapreduce.Job: Job job_1423869234667_0007 failed with state FAILED due to: Application application_1423869234667_0007 failed 2 times due to AM Container for appattempt_1423869234667_0007_000002 exited with exitCode: -1000 due to: Application application_1423869234667_0007 initialization failed (exitCode=255) with output: main : command provided 0
main : user is nobody
main : requested yarn user is hdfs
Can't create directory /data1/yarn/nm/usercache/hdfs/appcache/application_1423869234667_0007 - Permission denied
Can't create directory /data10/yarn/nm/usercache/hdfs/appcache/application_1423869234667_0007 - Permission denied
Can't create directory /data11/yarn/nm/usercache/hdfs/appcache/application_1423869234667_0007 - Permission denied
Can't create directory /data12/yarn/nm/usercache/hdfs/appcache/application_1423869234667_0007 - Permission denied
Can't create directory /data2/yarn/nm/usercache/hdfs/appcache/application_1423869234667_0007 - Permission denied
Can't create directory /data3/yarn/nm/usercache/hdfs/appcache/application_1423869234667_0007 - Permission denied
Can't create directory /data4/yarn/nm/usercache/hdfs/appcache/application_1423869234667_0007 - Permission denied
Can't create directory /data5/yarn/nm/usercache/hdfs/appcache/application_1423869234667_0007 - Permission denied
Can't create directory /data6/yarn/nm/usercache/hdfs/appcache/application_1423869234667_0007 - Permission denied
Can't create directory /data7/yarn/nm/usercache/hdfs/appcache/application_1423869234667_0007 - Permission denied
Can't create directory /data8/yarn/nm/usercache/hdfs/appcache/application_1423869234667_0007 - Permission denied
Can't create directory /data9/yarn/nm/usercache/hdfs/appcache/application_1423869234667_0007 - Permission denied
Did not create any app directories
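The "Permission denied" on every usercache directory typically means stale per-user cache directories with the wrong ownership under the NodeManager local dirs. A sketch of the usual remedy, demonstrated here on stand-in local paths (on the real cluster the dirs come from yarn.nodemanager.local-dirs, i.e. /data1 through /data12 in the log above, and the NodeManagers should be restarted afterwards; the paths below are assumptions for illustration):

```shell
# Stand-ins for the NodeManager local dirs (yarn.nodemanager.local-dirs);
# on the cluster in this thread they would be /data1 ... /data12.
NM_LOCAL_DIRS="/tmp/demo/data1 /tmp/demo/data2"

# Simulate stale usercache dirs left over with the wrong ownership.
for d in $NM_LOCAL_DIRS; do
  mkdir -p "$d/yarn/nm/usercache/hdfs/appcache"
done

# The fix: remove the stale per-user cache on every NodeManager node so it
# is recreated with correct ownership at the next container launch.
for d in $NM_LOCAL_DIRS; do
  rm -rf "$d/yarn/nm/usercache/hdfs"
done
```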

New Contributor

Nara,

 

You could try deleting the /data1/yarn/nm/usercache/hdfs directory (and its counterparts on the other data mounts) from your DataNodes, as suggested here:

http://community.cloudera.com/t5/Batch-Processing-and-Workflow/Can-t-create-directory-yarn-nm-userca...

 

That worked for me.