Support Questions

Find answers, ask questions, and share your expertise

Sqoop 1.4.6.2.6.3.0-235 import failing

Super Collaborator

The following Sqoop command is failing with the error shown below:

sqoop job --create incjob  -- import --connect "jdbc:oracle:thin:@(description=(address=(protocol=tcp)(host=patronQA)(port=1526))(connect_data=(service_name=patron)))" --username PATRON --password patvps --as-avrodatafile --incremental append --check-column PUR_TRANS_DATE --table PATRON.TAB4 --split-by TAB4.PUR_DET_ID --compression-codec snappy --target-dir /sqoop-avro
sqoop job --exec incjob

Command error:

18/03/27 13:23:24 INFO mapreduce.Job: Task Id : attempt_1522170856018_0003_m_000000_2, Status : FAILED
Error: org.apache.avro.reflect.ReflectData.addLogicalTypeConversion(Lorg/apache/avro/Conversion;)V
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143.

YARN application log output:

echo "Copying debugging information"
# Creating copy of launch script
cp "launch_container.sh" "/hadoop/yarn/log/application_1522170856018_0003/container_e08_1522170856018_0003_01_000005/launch_container.sh"
chmod 640 "/hadoop/yarn/log/application_1522170856018_0003/container_e08_1522170856018_0003_01_000005/launch_container.sh"
# Determining directory contents
echo "ls -l:" 1>"/hadoop/yarn/log/application_1522170856018_0003/container_e08_1522170856018_0003_01_000005/directory.info"
ls -l 1>>"/hadoop/yarn/log/application_1522170856018_0003/container_e08_1522170856018_0003_01_000005/directory.info"
echo "find -L . -maxdepth 5 -ls:" 1>>"/hadoop/yarn/log/application_1522170856018_0003/container_e08_1522170856018_0003_01_000005/directory.info"
find -L . -maxdepth 5 -ls 1>>"/hadoop/yarn/log/application_1522170856018_0003/container_e08_1522170856018_0003_01_000005/directory.info"
echo "broken symlinks(find -L . -maxdepth 5 -type l -ls):" 1>>"/hadoop/yarn/log/application_1522170856018_0003/container_e08_1522170856018_0003_01_000005/directory.info"
find -L . -maxdepth 5 -type l -ls 1>>"/hadoop/yarn/log/application_1522170856018_0003/container_e08_1522170856018_0003_01_000005/directory.info"
echo "Launching container"
exec /bin/bash -c "$JAVA_HOME/bin/java -server -XX:NewRatio=8 -Djava.net.preferIPv4Stack=true -Dhdp.version=2.6.3.0-235 -Xmx1228m -Djava.io.tmpdir=$PWD/tmp -Dlog4j.configuration=container-log4j.properties -Dyarn.app.container.log.dir=/hadoop/yarn/log/application_1522170856018_0003/container_e08_1522170856018_0003_01_000005 -Dyarn.app.container.log.filesize=0 -Dhadoop.root.logger=INFO,CLA -Dhadoop.root.logfile=syslog org.apache.hadoop.mapred.YarnChild 10.100.44.20 59891 attempt_1522170856018_0003_m_000000_2 8796093022213 1>/hadoop/yarn/log/application_1522170856018_0003/container_e08_1522170856018_0003_01_000005/stdout 2>/hadoop/yarn/log/application_1522170856018_0003/container_e08_1522170856018_0003_01_000005/stderr "
End of LogType:launch_container.sh.This log file belongs to a running container (container_e08_1522170856018_0003_01_000005) and so may not be complete.
************************************************************************************

5 REPLIES

Super Collaborator

I narrowed down the error: it is caused by the "--as-avrodatafile" option. If I remove this option, Sqoop works fine.

But I do want this option, since I want to create an Avro file.


@Sami Ahmad

Add the following property to your Sqoop job.

-Dmapreduce.job.user.classpath.first=true

Try this and let me know if that works for you.
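
For context, that addLogicalTypeConversion NoSuchMethodError usually means an older Avro jar from the Hadoop/MapReduce classpath is being loaded ahead of the newer Avro that Sqoop's --as-avrodatafile support needs, which is why forcing the user classpath first helps. A quick way to check which Avro versions are on a node (a sketch; the path is an assumption based on a typical HDP 2.6.x layout):

# List every Avro jar under the HDP install tree (path assumed for HDP 2.6.x);
# seeing more than one Avro version usually confirms the classpath conflict
find -L /usr/hdp/current -name 'avro-*.jar' 2>/dev/null | sort -u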

Super Collaborator

Thanks, that worked, but it's not working in the Sqoop job. Probably wrong syntax?

[hdfs@hadoop1 ~]$ sqoop job --create incjob4  -- import -Dmapreduce.job.user.classpath.first=true --connect "jdbc:oracle:thin:@(description=(address=(protocol=tcp)(host=patronQA)(port=1526))(connect_data=(service_name=patron)))" --username PATRON --password patvps --as-avrodatafile --incremental append --check-column PUR_TRANS_DATE --table PATRON.TAB4 --split-by TAB4.PUR_DET_ID --compression-codec snappy --target-dir /sqoop-avro
Warning: /usr/hdp/2.6.3.0-235/accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
18/03/27 14:36:49 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.6.3.0-235
18/03/27 14:36:50 ERROR tool.BaseSqoopTool: Error parsing arguments for import:
18/03/27 14:36:50 ERROR tool.BaseSqoopTool: Unrecognized argument: -Dmapreduce.job.user.classpath.first=true
18/03/27 14:36:50 ERROR tool.BaseSqoopTool: Unrecognized argument: --connect
18/03/27 14:36:50 ERROR tool.BaseSqoopTool: Unrecognized argument: jdbc:oracle:thin:@(description=(address=(protocol=tcp)(host=patronQA)(port=1526))(connect_data=(service_name=patron)))
18/03/27 14:36:50 ERROR tool.BaseSqoopTool: Unrecognized argument: --username
18/03/27 14:36:50 ERROR tool.BaseSqoopTool: Unrecognized argument: PATRON
18/03/27 


@Sami Ahmad The syntax is this:

sqoop job (generic-args) (job-args)

So try changing your Sqoop job to something like:

sqoop job -Dmapreduce.job.user.classpath.first=true --create incjob4  -- import <Everything else>

Let me know if that works!
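
For reference, combining that with the original command from this thread, the whole thing would look like the sketch below (only the position of the -D flag changes; every other argument is copied from the earlier attempt):

sqoop job -Dmapreduce.job.user.classpath.first=true --create incjob4 -- import --connect "jdbc:oracle:thin:@(description=(address=(protocol=tcp)(host=patronQA)(port=1526))(connect_data=(service_name=patron)))" --username PATRON --password patvps --as-avrodatafile --incremental append --check-column PUR_TRANS_DATE --table PATRON.TAB4 --split-by TAB4.PUR_DET_ID --compression-codec snappy --target-dir /sqoop-avro
sqoop job --exec incjob4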

Super Collaborator

Yes, that worked. Thanks!