
Sqoop action on Oozie fails when run from Workflow Manager


I can't find any log information that helps me figure out this problem.
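For context, the log excerpts below were collected from YARN. A minimal sketch of how such logs can be pulled, assuming the Oozie CLI and the yarn client are on the path (the workflow ID and application ID used here are the ones that appear in the logs below):

```shell
# Show workflow status and the external (YARN) IDs of each action
oozie job -info 0000000-171014010245511-oozie-oozi-W

# Dump the Oozie-side log for the workflow
oozie job -log 0000000-171014010245511-oozie-oozi-W

# Fetch the aggregated YARN container logs for the launcher job
yarn logs -applicationId application_1507942765781_0017
```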

Workflow.xml

<?xml version="1.0" encoding="UTF-8" standalone="no"?><workflow-app xmlns="uri:oozie:workflow:0.5" name="employees">
    <start to="sqoop_1"/>
    <action name="sqoop_1">
        <sqoop xmlns="uri:oozie:sqoop-action:0.4">
            <job-tracker>${resourceManager}</job-tracker>
            <name-node>${nameNode}</name-node>
            <prepare>
                <delete path="/user/admin/employees"/>
            </prepare>
            <configuration>
                <property>
                    <name>oozie.action.sharelib.for.sqoop</name>
                    <value>sqoop,hive,hcatalog</value>
                </property>
            </configuration>
            <command>import --connect jdbc:mysql://docker.for.mac.localhost:3306/employees --username root --password root --table employees --hive-import</command>
            <file>/user/oozie/share/lib/lib_20170728140342/hive/hive-conf.xml#hive-site.xml</file>
            <file>/user/oozie/share/lib/lib_20170728140342/tez/tez-conf.xml</file>
        </sqoop>
        <ok to="end"/>
        <error to="kill"/>
    </action>
    <kill name="kill">
        <message>${wf:errorMessage(wf:lastErrorNode())}</message>
    </kill>
    <end name="end"/>
</workflow-app>
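For reference, the workflow above resolves `${resourceManager}` and `${nameNode}` from the job configuration. A minimal job.properties sketch that could drive it — the NameNode address matches the HDFS URI in the logs below, but the ResourceManager port and application path are assumptions, not taken from my setup verbatim:

```properties
nameNode=hdfs://sandbox.hortonworks.com:8020
resourceManager=sandbox.hortonworks.com:8050
oozie.use.system.libpath=true
oozie.wf.application.path=${nameNode}/user/admin/employees
```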

Log Type: directory.info

Log Upload Time: Mon Oct 16 16:30:08 +0000 2017

Log Length: 58849

Showing 4096 bytes of 58849 total.

jaxb-impl-2.2.3-1.jar
538086   16 -r-xr-xr-x   1 yarn     hadoop      12452 Oct 16 11:57 ./geronimo-annotation_1.0_spec-1.1.1.jar
538196   28 -r-xr-xr-x   1 yarn     hadoop      26514 Oct 16 11:57 ./stax-api-1.0.1.jar
537971  164 -r-xr-xr-x   1 yarn     hadoop     166281 Oct 16 11:57 ./tez-runtime-internals-0.7.0.2.6.1.0-129.jar
538062  316 -r-xr-xr-x   1 yarn     hadoop     321639 Oct 16 11:57 ./httpcore-4.4.jar
536910   32 -r-xr-xr-x   1 yarn     hadoop      29555 Oct 16 11:57 ./paranamer-2.3.jar
538076 1964 -r-xr-xr-x   1 yarn     hadoop    2009163 Oct 16 11:57 ./datanucleus-core-4.1.6.jar
538127  104 -r-xr-xr-x   1 yarn     hadoop     105112 Oct 16 11:57 ./servlet-api-2.5.jar
538240  100 -r-xr-xr-x   1 yarn     hadoop     101126 Oct 16 11:57 ./hadoop-yarn-registry-2.7.3.2.6.1.0-129.jar
536849  560 -r-xr-xr-x   1 yarn     hadoop     570101 Oct 15 04:47 ./aws-java-sdk-s3-1.10.6.jar
538369   20 -r-xr-xr-x   1 yarn     hadoop      17392 Oct 16 11:57 ./oozie-sharelib-hcatalog-4.2.0.2.6.1.0-129.jar
538285  160 -r-xr-xr-x   1 yarn     hadoop     160519 Oct 16 11:57 ./commons-dbcp-1.4.jar
538250   28 -r-xr-xr-x   1 yarn     hadoop      26176 Oct 16 11:57 ./slf4j-api-1.6.6.jar
536843  224 -r-xr-xr-x   1 yarn     hadoop     225302 Oct 15 04:47 ./jackson-core-2.4.4.jar
538213  200 -r-xr-xr-x   1 yarn     hadoop     201124 Oct 16 11:57 ./jdo-api-3.0.1.jar
536871  100 -r-xr-xr-x   1 yarn     hadoop      99555 Oct 15 04:48 ./xz-1.5.jar
538417  256 -r-xr-xr-x   1 yarn     hadoop     262050 Oct 16 11:57 ./hive-hcatalog-core-1.2.1000.2.6.1.0-129.jar
538291  108 -r-xr-xr-x   1 yarn     hadoop     108763 Oct 16 11:57 ./hive-webhcat-java-client-1.2.1000.2.6.1.0-129.jar
538057  576 -r-xr-xr-x   1 yarn     hadoop     588337 Oct 16 11:57 ./commons-collections-3.2.2.jar
538375  480 -r-xr-xr-x   1 yarn     hadoop     489884 Oct 16 11:57 ./log4j-1.2.17.jar
538000   52 -r-xr-xr-x   1 yarn     hadoop      49707 Oct 16 11:57 ./hive-ant-1.2.1000.2.6.1.0-129.jar
538322  272 -r-xr-xr-x   1 yarn     hadoop     276425 Oct 16 11:57 ./tez-mapreduce-0.7.0.2.6.1.0-129.jar
538360   12 -r-xr-xr-x   1 yarn     hadoop       9711 Oct 16 11:57 ./slf4j-log4j12-1.6.6.jar
536880   36 -r-xr-xr-x   1 yarn     hadoop      34604 Oct 15 04:48 ./paranamer-2.7.jar
538159   96 -r-xr-xr-x   1 yarn     hadoop      96221 Oct 16 11:57 ./commons-pool-1.5.4.jar
538282   16 -r-xr-xr-x   1 yarn     hadoop      15071 Oct 16 11:57 ./jta-1.1.jar
536883  632 -r-xr-xr-x   1 yarn     hadoop     643727 Oct 15 04:48 ./hsqldb-1.8.0.7.jar
538174  704 -r-xr-xr-x   1 yarn     hadoop     719304 Oct 16 11:57 ./httpclient-4.4.jar
538030   44 -r-xr-xr-x   1 yarn     hadoop      41755 Oct 16 11:57 ./objenesis-2.1.jar
538102   64 -r-xr-xr-x   1 yarn     hadoop      64093 Oct 16 11:57 ./hive-bridge-0.8.0.2.6.1.0-129.jar
538147   32 -r-xr-xr-x   1 yarn     hadoop      30359 Oct 16 11:57 ./apache-curator-2.6.0.pom
537997  232 -r-xr-xr-x   1 yarn     hadoop     236660 Oct 16 11:57 ./ST4-4.0.4.jar
538045 1224 -r-xr-xr-x   1 yarn     hadoop    1251514 Oct 16 11:57 ./snappy-java-1.0.5.jar
538435 21432 -r-xr-xr-x   1 yarn     hadoop   21944115 Oct 16 15:31 ./hive-exec-1.2.1000.2.6.1.0-129.jar
538215   20 -r-xr-xr-x   1 yarn     hadoop      20463 Oct 16 11:57 ./oozie-sharelib-hive-4.2.0.2.6.1.0-129.jar
538264  112 -r-xr-xr-x   1 yarn     hadoop     112835 Oct 16 11:57 ./hive-shims-common-1.2.1000.2.6.1.0-129.jar
538315    4 -r-xr-xr-x   1 yarn     hadoop       2497 Oct 16 11:57 ./javax.inject-1.jar
538171   20 -r-xr-xr-x   1 yarn     hadoop      20133 Oct 16 11:57 ./avatica-metrics-1.8.0.2.6.1.0-129.jar
537987  380 -r-xr-xr-x   1 yarn     hadoop     388864 Oct 16 11:57 ./mail-1.4.jar
537991   24 -r-xr-xr-x   1 yarn     hadoop      21879 Oct 16 11:57 ./asm-tree-3.1.jar
3695476  312 -r-xr-xr-x   1 yarn     hadoop     319099 Oct 15 04:47 ./okhttp-2.4.0.jar
536862  220 -r-xr-xr-x   1 yarn     hadoop     222041 Oct 15 04:48 ./hadoop-aws-2.7.3.2.6.1.0-129.jar
538093   12 -r-xr-xr-x   1 yarn     hadoop      11485 Oct 16 11:57 ./hdfs-model-0.8.0.2.6.1.0-129.jar
broken symlinks(find -L . -maxdepth 5 -type l -ls):

Log Type: launch_container.sh

Log Upload Time: Mon Oct 16 16:30:08 +0000 2017

Log Length: 43303

Showing 4096 bytes of 43303 total.

ramework-2.6.0.jar" "curator-framework-2.6.0.jar"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
ln -sf "/hadoop/yarn/local/filecache/186/jackson-xc-1.9.13.jar" "jackson-xc-1.9.13.jar"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
ln -sf "/hadoop/yarn/local/filecache/107/metrics-jvm-3.1.0.jar" "metrics-jvm-3.1.0.jar"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
ln -sf "/hadoop/yarn/local/filecache/74/commons-collections-3.2.2.jar" "commons-collections-3.2.2.jar"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
ln -sf "/hadoop/yarn/local/filecache/171/json4s-native_2.11-3.2.11.jar" "json4s-native_2.11-3.2.11.jar"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
ln -sf "/hadoop/yarn/local/filecache/19/jackson-core-2.4.4.jar" "jackson-core-2.4.4.jar"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
ln -sf "/hadoop/yarn/local/filecache/113/httpclient-4.4.jar" "httpclient-4.4.jar"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
ln -sf "/hadoop/yarn/local/filecache/145/hive-contrib-1.2.1000.2.6.1.0-129.jar" "hive-contrib-1.2.1000.2.6.1.0-129.jar"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
ln -sf "/hadoop/yarn/local/filecache/136/protobuf-java-2.5.0.jar" "protobuf-java-2.5.0.jar"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
ln -sf "/hadoop/yarn/local/filecache/143/hive-shims-common-1.2.1000.2.6.1.0-129.jar" "hive-shims-common-1.2.1000.2.6.1.0-129.jar"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
ln -sf "/hadoop/yarn/local/filecache/97/servlet-api-2.5.jar" "servlet-api-2.5.jar"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi
# Creating copy of launch script
cp "launch_container.sh" "/hadoop/yarn/log/application_1507942765781_0017/container_1507942765781_0017_01_000002/launch_container.sh"
chmod 640 "/hadoop/yarn/log/application_1507942765781_0017/container_1507942765781_0017_01_000002/launch_container.sh"
# Determining directory contents
echo "ls -l:" 1>"/hadoop/yarn/log/application_1507942765781_0017/container_1507942765781_0017_01_000002/directory.info"
ls -l 1>>"/hadoop/yarn/log/application_1507942765781_0017/container_1507942765781_0017_01_000002/directory.info"
echo "find -L . -maxdepth 5 -ls:" 1>>"/hadoop/yarn/log/application_1507942765781_0017/container_1507942765781_0017_01_000002/directory.info"
find -L . -maxdepth 5 -ls 1>>"/hadoop/yarn/log/application_1507942765781_0017/container_1507942765781_0017_01_000002/directory.info"
echo "broken symlinks(find -L . -maxdepth 5 -type l -ls):" 1>>"/hadoop/yarn/log/application_1507942765781_0017/container_1507942765781_0017_01_000002/directory.info"
find -L . -maxdepth 5 -type l -ls 1>>"/hadoop/yarn/log/application_1507942765781_0017/container_1507942765781_0017_01_000002/directory.info"
exec /bin/bash -c "$JAVA_HOME/bin/java -server -XX:NewRatio=8 -Djava.net.preferIPv4Stack=true -Dhdp.version=2.6.1.0-129 -Xmx200m -Xmx200m -Djava.io.tmpdir=$PWD/tmp -Dlog4j.configuration=container-log4j.properties -Dyarn.app.container.log.dir=/hadoop/yarn/log/application_1507942765781_0017/container_1507942765781_0017_01_000002 -Dyarn.app.container.log.filesize=0 -Dhadoop.root.logger=INFO,CLA -Dhadoop.root.logfile=syslog org.apache.hadoop.mapred.YarnChild 172.17.0.2 46409 attempt_1507942765781_0017_m_000000_0 2 1>/hadoop/yarn/log/application_1507942765781_0017/container_1507942765781_0017_01_000002/stdout 2>/hadoop/yarn/log/application_1507942765781_0017/container_1507942765781_0017_01_000002/stderr "
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
  exit $hadoop_shell_errorcode
fi

Log Type: stderr

Log Upload Time: Mon Oct 16 16:30:08 +0000 2017

Log Length: 1171

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/hadoop/yarn/local/filecache/24/mapreduce.tar.gz/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/hadoop/yarn/local/filecache/175/slf4j-log4j12-1.6.6.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Note: /tmp/sqoop-yarn/compile/2588b43edf56c5b45109989cf4ff80e8/employees.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.

Logging initialized using configuration in jar:file:/hadoop/yarn/local/filecache/51/hive-common-1.2.1000.2.6.1.0-129.jar!/hive-log4j.properties
Intercepting System.exit(1)
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1]
log4j:WARN No appenders could be found for logger (org.apache.hadoop.ipc.Client).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.

Log Type: stdout

Log Upload Time: Mon Oct 16 16:30:08 +0000 2017

Log Length: 299872

Showing 4096 bytes of 299872 total.

ime spent by all map tasks (ms)=130984
		Total vcore-milliseconds taken by all map tasks=130984
		Total megabyte-milliseconds taken by all map tasks=32746000
	Map-Reduce Framework
		Map input records=300024
		Map output records=300024
		Input split bytes=464
		Spilled Records=0
		Failed Shuffles=0
		Merged Map outputs=0
		GC time elapsed (ms)=939
		CPU time spent (ms)=24960
		Physical memory (bytes) snapshot=637845504
		Virtual memory (bytes) snapshot=8631586816
		Total committed heap usage (bytes)=182452224
	File Input Format Counters 
		Bytes Read=0
	File Output Format Counters 
		Bytes Written=13821993
68457 [main] INFO  org.apache.hadoop.mapreduce.Job  - Counters: 30
	File System Counters
		FILE: Number of bytes read=0
		FILE: Number of bytes written=1623588
		FILE: Number of read operations=0
		FILE: Number of large read operations=0
		FILE: Number of write operations=0
		HDFS: Number of bytes read=464
		HDFS: Number of bytes written=13821993
		HDFS: Number of read operations=16
		HDFS: Number of large read operations=0
		HDFS: Number of write operations=8
	Job Counters 
		Launched map tasks=4
		Other local map tasks=4
		Total time spent by all maps in occupied slots (ms)=130984
		Total time spent by all reduces in occupied slots (ms)=0
		Total time spent by all map tasks (ms)=130984
		Total vcore-milliseconds taken by all map tasks=130984
		Total megabyte-milliseconds taken by all map tasks=32746000
	Map-Reduce Framework
		Map input records=300024
		Map output records=300024
		Input split bytes=464
		Spilled Records=0
		Failed Shuffles=0
		Merged Map outputs=0
		GC time elapsed (ms)=939
		CPU time spent (ms)=24960
		Physical memory (bytes) snapshot=637845504
		Virtual memory (bytes) snapshot=8631586816
		Total committed heap usage (bytes)=182452224
	File Input Format Counters 
		Bytes Read=0
	File Output Format Counters 
		Bytes Written=13821993
68483 [main] INFO  org.apache.sqoop.mapreduce.ImportJobBase  - Transferred 13.1817 MB in 57.8593 seconds (233.2908 KB/sec)
68483 [main] INFO  org.apache.sqoop.mapreduce.ImportJobBase  - Transferred 13.1817 MB in 57.8593 seconds (233.2908 KB/sec)
68486 [main] INFO  org.apache.sqoop.mapreduce.ImportJobBase  - Retrieved 300024 records.
68486 [main] INFO  org.apache.sqoop.mapreduce.ImportJobBase  - Retrieved 300024 records.
68487 [main] INFO  org.apache.sqoop.mapreduce.ImportJobBase  - Publishing Hive/Hcat import job data to Listeners
68487 [main] INFO  org.apache.sqoop.mapreduce.ImportJobBase  - Publishing Hive/Hcat import job data to Listeners
68557 [main] INFO  org.apache.sqoop.manager.SqlManager  - Executing SQL statement: SELECT t.* FROM `employees` AS t LIMIT 1
68557 [main] INFO  org.apache.sqoop.manager.SqlManager  - Executing SQL statement: SELECT t.* FROM `employees` AS t LIMIT 1
68573 [main] WARN  org.apache.sqoop.hive.TableDefWriter  - Column birth_date had to be cast to a less precise type in Hive
68573 [main] WARN  org.apache.sqoop.hive.TableDefWriter  - Column birth_date had to be cast to a less precise type in Hive
68573 [main] WARN  org.apache.sqoop.hive.TableDefWriter  - Column hire_date had to be cast to a less precise type in Hive
68573 [main] WARN  org.apache.sqoop.hive.TableDefWriter  - Column hire_date had to be cast to a less precise type in Hive
68578 [main] INFO  org.apache.sqoop.hive.HiveImport  - Loading uploaded data into Hive
68578 [main] INFO  org.apache.sqoop.hive.HiveImport  - Loading uploaded data into Hive

<<< Invocation of Sqoop command completed <<<

Hadoop Job IDs executed by Sqoop: job_1507942765781_0018

Intercepting System.exit(1)

<<< Invocation of Main class completed <<<

Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1]

Oozie Launcher failed, finishing Hadoop job gracefully

Oozie Launcher, uploading action data to HDFS sequence file: hdfs://sandbox.hortonworks.com:8020/user/admin/oozie-oozi/0000000-171014010245511-oozie-oozi-W/sqoop_1--sqoop/action-data.seq
Successfully reset security manager from org.apache.oozie.action.hadoop.LauncherSecurityManager@b6bccb4 to null

Oozie Launcher ends

Log Type: syslog

Log Upload Time: Mon Oct 16 16:30:08 +0000 2017

Log Length: 2414

2017-10-16 16:28:47,810 INFO [main] org.apache.hadoop.security.SecurityUtil: Updating Configuration
2017-10-16 16:28:47,913 WARN [main] org.apache.hadoop.metrics2.impl.MetricsConfig: Cannot locate configuration: tried hadoop-metrics2-maptask.properties,hadoop-metrics2.properties
2017-10-16 16:28:48,049 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
2017-10-16 16:28:48,049 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: MapTask metrics system started
2017-10-16 16:28:48,070 INFO [main] org.apache.hadoop.mapred.YarnChild: Executing with tokens:
2017-10-16 16:28:48,070 INFO [main] org.apache.hadoop.mapred.YarnChild: Kind: mapreduce.job, Service: job_1507942765781_0017, Ident: (org.apache.hadoop.mapreduce.security.token.JobTokenIdentifier@4c2bb6e0)
2017-10-16 16:28:48,389 INFO [main] org.apache.hadoop.mapred.YarnChild: Kind: RM_DELEGATION_TOKEN, Service: 172.17.0.2:8032, Ident: (owner=admin, renewer=oozie mr token, realUser=oozie, issueDate=1508171306987, maxDate=1508776106987, sequenceNumber=45, masterKeyId=4)
2017-10-16 16:28:48,478 INFO [main] org.apache.hadoop.mapred.YarnChild: Sleeping for 0ms before retrying again. Got null now.
2017-10-16 16:28:48,950 INFO [main] org.apache.hadoop.mapred.YarnChild: mapreduce.cluster.local.dir for child: /hadoop/yarn/local/usercache/admin/appcache/application_1507942765781_0017
2017-10-16 16:28:49,519 INFO [main] org.apache.hadoop.conf.Configuration.deprecation: session.id is deprecated. Instead, use dfs.metrics.session-id
2017-10-16 16:28:50,359 INFO [main] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter: File Output Committer Algorithm version is 1
2017-10-16 16:28:50,359 INFO [main] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter: FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
2017-10-16 16:28:50,388 INFO [main] org.apache.hadoop.mapred.Task:  Using ResourceCalculatorProcessTree : [ ]
2017-10-16 16:28:50,716 INFO [main] org.apache.hadoop.mapred.MapTask: Processing split: org.apache.oozie.action.hadoop.OozieLauncherInputFormat$EmptySplit@3d4d3fe7
2017-10-16 16:28:50,735 INFO [main] org.apache.hadoop.mapred.MapTask: numReduceTasks: 0 

2017-10-16 16:28:50,834 INFO [main] org.apache.hadoop.conf.Configuration.deprecation: mapred.job.id is deprecated. Instead, use mapreduce.job.id
