I am using CDH 5. I am trying to run a small import query through Sqoop to test how the file gets created in HDFS. The job gets stuck after displaying the job ID: it neither completes nor errors out. Please check the below image for reference.
sqoop import --connect jdbc:oracle:thin:*****/*****@hldbtest.global.triniti.com:1522:R1224DEV --query "SELECT ORDER_NUMBER,ORG_ID,HEADER_ID,CREATION_DATE FROM OE_ORDER_HEADERS_ALL WHERE \$CONDITIONS AND ROWNUM<=10" --split-by HEADER_ID --target-dir /user/hdfs/sqoop
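Not part of the original command, but as a hedged diagnostic: re-running the same import with a single mapper (`-m 1`) removes the parallel-split step, which can help isolate whether the hang is in split calculation or in YARN/MapReduce itself. The credentials and the `/user/hdfs/sqoop_test` target directory below are placeholders:

```shell
# Hypothetical single-mapper test run (USER/PASS and target dir are placeholders).
# With -m 1, Sqoop skips the min/max boundary query on HEADER_ID, so if this
# still hangs, the problem is likely on the YARN/MapReduce side.
sqoop import \
  --connect jdbc:oracle:thin:USER/PASS@hldbtest.global.triniti.com:1522:R1224DEV \
  --query "SELECT ORDER_NUMBER,ORG_ID,HEADER_ID,CREATION_DATE FROM OE_ORDER_HEADERS_ALL WHERE \$CONDITIONS AND ROWNUM<=10" \
  -m 1 \
  --target-dir /user/hdfs/sqoop_test
```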
I researched this issue and found that it could be a problem with the MapReduce framework itself rather than with the query.
For example, to check whether MapReduce jobs run at all, I executed the command below in PuTTY; it also neither succeeds nor fails.
hadoop jar /opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar pi 10 100
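When a job submits but never runs, it is worth checking the YARN side directly. A minimal sketch using the standard YARN CLI (run on the cluster itself):

```shell
# List submitted applications and their states. A job stuck in ACCEPTED has
# been queued by the ResourceManager but has not yet been given a container
# for its ApplicationMaster.
yarn application -list -appStates ACCEPTED,RUNNING

# Show each NodeManager and its resource status. Zero active nodes, or no
# available memory/vcores, would explain why nothing moves past ACCEPTED.
yarn node -list -all
```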
Please help in resolving this.
Thanks for the reply.
I submitted the command once again in PuTTY and checked the status in the ResourceManager UI; as mentioned in your reply, it is in the "ACCEPTED" state.
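"ACCEPTED" means the ResourceManager has queued the application but cannot allocate a container for its ApplicationMaster, most often because no NodeManager is live or the configured memory is too small for even one container. One common check on small CDH test clusters is the per-node and per-container memory settings in `yarn-site.xml`. The values below are illustrative assumptions, not your cluster's actual settings:

```xml
<!-- Illustrative yarn-site.xml fragment; tune to the node's real RAM. -->
<property>
  <name>yarn.nodemanager.resource.memory-mb</name>
  <value>4096</value> <!-- total memory the NodeManager may hand out -->
</property>
<property>
  <name>yarn.scheduler.minimum-allocation-mb</name>
  <value>512</value>  <!-- smallest container YARN will grant -->
</property>
<property>
  <name>yarn.scheduler.maximum-allocation-mb</name>
  <value>2048</value> <!-- largest single container request allowed -->
</property>
```

On CDH these settings are normally managed through Cloudera Manager rather than by editing the file directly, so adjust them there if that is how the cluster was deployed.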
Please check the screenshots below.
Any update? Please help. I am awaiting your reply; I am blocked on everything until this is resolved.