Member since: 05-19-2016
Posts: 216
Kudos Received: 20
Solutions: 4
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 4188 | 05-29-2018 11:56 PM |
| | 7021 | 07-06-2017 02:50 AM |
| | 3765 | 10-09-2016 12:51 AM |
| | 3530 | 05-13-2016 04:17 AM |
05-06-2016
12:29 PM
@Artem Ervits: I have the exact same problem. MySQL is hosted on an altogether different system, which is not part of my Hadoop cluster and only hosts the MySQL database. I have the user root@'IP' granted all permissions, and I can connect through mysql -u -p -h, but it throws the mentioned error whenever I try to connect via Sqoop. What else could possibly be the problem? How do I fix this?
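For context, a common cause of Sqoop failing where the mysql client succeeds is that the Sqoop map tasks connect from the cluster's worker nodes, which a root@'IP' grant does not cover. A minimal sketch of a broader grant, assuming the error is an access-denied from MySQL (the user name, password, and database below are placeholders):

```sh
# Hypothetical sketch: allow connections from any host (or list each
# cluster node explicitly) so Sqoop map tasks on the workers can connect.
# 'sqoopuser', 'secret', and the 'erp' database are assumptions.
mysql -u root -p -e "
GRANT ALL PRIVILEGES ON erp.* TO 'sqoopuser'@'%' IDENTIFIED BY 'secret';
FLUSH PRIVILEGES;"
```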
05-06-2016
12:17 PM
I have the exact same problem. MySQL is hosted on an altogether different system, which is not part of my Hadoop cluster and only hosts the MySQL database. I have the user root@'IP' granted all permissions, and I can connect through mysql -u -p -h, but it throws the mentioned error whenever I try to connect via Sqoop. What else could possibly be the problem? How do I fix this? @neeraj
05-06-2016
07:16 AM
Awesome 🙂 Worked like a charm. Thanks!
05-06-2016
07:14 AM
You saved me. I just was not able to figure out what the "launcher logs" were. These logs are helpful. Thanks!
05-06-2016
06:40 AM
1 Kudo
How do I fix this error?

java.io.IOException: SQLException in nextKeyValue
 at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:277)
at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:556)
at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: java.sql.SQLException: Value '0000-00-00' can not be represented as java.sql.Date
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:957)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:896)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:885)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:860)
at com.mysql.jdbc.ResultSetRow.getDateFast(ResultSetRow.java:145)
at com.mysql.jdbc.ByteArrayRow.getDateFast(ByteArrayRow.java:243)
at com.mysql.jdbc.ResultSetImpl.getDate(ResultSetImpl.java:2015)
at com.mysql.jdbc.ResultSetImpl.getDate(ResultSetImpl.java:1978)
at org.apache.sqoop.lib.JdbcWritableBridge.readDate(JdbcWritableBridge.java:115)
at com.cloudera.sqoop.lib.JdbcWritableBridge.readDate(JdbcWritableBridge.java:87)
at additional_input_invoice.readFields(additional_input_invoice.java:325)
at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:244)
... 12 more
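The root cause is the zero date: MySQL allows '0000-00-00' in DATE columns, but Connector/J cannot represent it as java.sql.Date. A hedged sketch of a common workaround is to tell the driver to return NULL for zero dates via the JDBC URL (the host, database, credentials, and mapper count below are illustrative; the table name comes from the stack trace above):

```sh
# Sketch: zeroDateTimeBehavior=convertToNull makes Connector/J return
# NULL for '0000-00-00' values instead of throwing SQLException.
sqoop import \
  --connect "jdbc:mysql://dbhost/erp?zeroDateTimeBehavior=convertToNull" \
  --username hive -P \
  --table additional_input_invoice \
  -m 1
```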
Labels:
- Apache Oozie
- Apache Sqoop
05-06-2016
06:22 AM
@Kuldeep Kulkarni Oh, I forgot to mention that it was only a single-table import that I had tried through the command line. Also, could you please clarify how I import all tables into a particular directory?
05-06-2016
05:57 AM
@Kuldeep Kulkarni: Are you sure that --target-dir works with import-all-tables? According to this link, it does not: https://groups.google.com/forum/#!msg/chennaihug/hWAKeO9Lnh0/y7RBYtti6AgJ Also, what should the command be to import all tables into Hive without having to create the tables in Hive first?
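For the Hive part, a sketch of the variant commonly used: import-all-tables combined with --hive-import creates the Hive table definitions as part of the import, so they do not have to exist beforehand (host, database, and credentials are placeholders):

```sh
# Sketch: --hive-import loads each table into Hive and creates the
# table definitions automatically; no CREATE TABLE needed up front.
sqoop import-all-tables \
  --connect jdbc:mysql://dbhost/erp \
  --username hive -P \
  --hive-import \
  -m 1
```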
05-06-2016
05:43 AM
1 Kudo
I am trying to run a Sqoop command through Oozie, and the Oozie launcher logs say: Unrecognized argument: --target-dir. This is what my command looks like:

import-all-tables --driver com.mysql.jdbc.Driver --connect jdbc:mysql://warehouse.swtched.com/erp --username hive --password hive -m 1 --target-dir sblDW

The --target-dir attribute works just fine when I run it through the command line. How do I fix this? The error:

>>> Invoking Sqoop command line now >>>
2893 [main] WARN org.apache.sqoop.tool.SqoopTool - $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
2016-05-06 11:11:34,741 WARN [main] tool.SqoopTool (SqoopTool.java:loadPluginsFromConfDir(177)) - $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
2941 [main] INFO org.apache.sqoop.Sqoop - Running Sqoop version: 1.4.6.2.4.0.0-169
2016-05-06 11:11:34,789 INFO [main] sqoop.Sqoop (Sqoop.java:<init>(97)) - Running Sqoop version: 1.4.6.2.4.0.0-169
2965 [main] WARN org.apache.sqoop.tool.BaseSqoopTool - Setting your password on the command-line is insecure. Consider using -P instead.
2016-05-06 11:11:34,813 WARN [main] tool.BaseSqoopTool (BaseSqoopTool.java:applyCredentialsOptions(1026)) - Setting your password on the command-line is insecure. Consider using -P instead.
2967 [main] ERROR org.apache.sqoop.tool.BaseSqoopTool - Error parsing arguments for import-all-tables:
2016-05-06 11:11:34,815 ERROR [main] tool.BaseSqoopTool (BaseSqoopTool.java:hasUnrecognizedArgs(304)) - Error parsing arguments for import-all-tables:
2967 [main] ERROR org.apache.sqoop.tool.BaseSqoopTool - Unrecognized argument: --target-dir
2016-05-06 11:11:34,815 ERROR [main] tool.BaseSqoopTool (BaseSqoopTool.java:hasUnrecognizedArgs(307)) - Unrecognized argument: --target-dir
2967 [main] ERROR org.apache.sqoop.tool.BaseSqoopTool - Unrecognized argument: sblDW
2016-05-06 11:11:34,815 ERROR [main] tool.BaseSqoopTool (BaseSqoopTool.java:hasUnrecognizedArgs(307)) - Unrecognized argument: sblDW
Intercepting System.exit(1)
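For reference, a hedged sketch of the same import using --warehouse-dir, which import-all-tables does accept; --target-dir only applies to single-table imports. Each table then lands in its own subdirectory under the given path (the path is illustrative):

```sh
# Sketch: import-all-tables rejects --target-dir; --warehouse-dir is its
# equivalent here and creates one subdirectory per table underneath it.
import-all-tables --driver com.mysql.jdbc.Driver \
  --connect jdbc:mysql://warehouse.swtched.com/erp \
  --username hive --password hive \
  -m 1 \
  --warehouse-dir /sblDW
```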
Labels:
- Apache Oozie
05-06-2016
04:46 AM
@Kuldeep Kulkarni: Thanks. I have updated it. Please check.
05-05-2016
01:52 PM
1 Kudo
I can use the list-databases and list-tables commands in Oozie for the Sqoop action, but the import command throws:

Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1]

I understand that it's a generic error, but I tried looking through the Oozie, Hive, and Sqoop logs, and nothing in there has any information about the error (except, of course, the Oozie logs, where this error is as far as I got, with no "caused by" or potential reasons). I have an Oozie workflow: the folder --> lib/mysql connector --> workflow.xml. Taking hints from other such problems, I added the JDBC jar to /user/oozie/share/lib/lib_timestamp/sqoop/, but it still gives the same error. I also added it to the lib folder in the workflow directory, but I'm not sure how that works. Do I need to reference it somewhere, or does Oozie pick it up automatically? workflow.xml looks like this:

<workflow-app name="once-a-day" xmlns="uri:oozie:workflow:0.5">
<start to="sqoopAction"/>
<action name="sqoopAction">
<sqoop xmlns="uri:oozie:sqoop-action:0.3">
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
<configuration>
<property>
<name>oozie.libpath</name>
<value>hdfs://FQDN:8020/user/oozie/share/lib</value>
</property>
<property>
<name>oozie.use.system.libpath</name>
<value>true</value>
</property>
<property>
<name>queueName</name>
<value>default</value>
</property>
<property>
<name>oozie.action.sharelib.for.sqoop</name>
<value>hive,hcatalog,sqoop</value>
</property>
<property>
<name>workflowAppUri</name>
<value>hdfs://FQDN:8020/user/oozie/scheduledimport</value>
</property>
<property>
<name>start</name>
<value>2016-04-26T00:00Z</value>
</property>
<property>
<name>end</name>
<value>2016-12-31T00:00Z</value>
</property>
<property>
<name>jobTracker</name>
<value>FQDN:8050</value>
</property>
<property>
<name>oozie.coord.application.path</name>
<value>hdfs://FQDN:8020/user/oozie/scheduledimport</value>
</property>
<property>
<name>nameNode</name>
<value>hdfs://FQDN:8020</value>
</property>
</configuration>
<command>import-all-tables --connect jdbc:mysql://FQDN/erp --username hive --password hive
</command>
</sqoop>
<ok to="end"/>
<error to="killJob"/>
</action>
<kill name="killJob">
<message>"Killed job due to error: ${wf:errorMessage(wf:lastErrorNode())}"</message>
</kill>
<end name="end" />
</workflow-app>

job.properties:

nameNode=hdfs://FQDN:8020
jobTracker=FQDN:8050
queueName=default
oozie.libpath=/user/oozie/share/lib/
oozie.use.system.libpath=true
oozie.wf.application.path=${nameNode}/user/oozie/scheduledimport
start=2016-04-26T00:00Z
end=2016-12-31T00:00Z
workflowAppUri=${nameNode}/user/oozie/scheduledimport
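As an aside, since Oozie splits the single <command> string on whitespace, a sketch of the same action passing one argument per <arg> element, which the sqoop-action schema also supports (the values mirror the workflow above):

```xml
<sqoop xmlns="uri:oozie:sqoop-action:0.3">
  <job-tracker>${jobTracker}</job-tracker>
  <name-node>${nameNode}</name-node>
  <!-- One Sqoop argument per element; Oozie passes each verbatim. -->
  <arg>import-all-tables</arg>
  <arg>--connect</arg>
  <arg>jdbc:mysql://FQDN/erp</arg>
  <arg>--username</arg>
  <arg>hive</arg>
  <arg>--password</arg>
  <arg>hive</arg>
</sqoop>
```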
Oozie logs taken from the launcher:
Log Type: directory.info
Log Upload Time: Fri May 06 10:33:18 +0530 2016
Log Length: 172674
Showing 4096 bytes of 172674 total.
5 17:21 ./hadoop-yarn-registry-2.7.1.2.4.0.0-169.jar
68419834 44 -r-xr-xr-x 1 yarn hadoop 43033 May 5 17:21 ./asm-3.1.jar
68419849 212 -r-xr-xr-x 1 yarn hadoop 213854 May 5 17:21 ./jline-2.12.jar
68421231 4 -rw-r--r-- 1 yarn hadoop 16 May 6 10:33 ./.default_container_executor.sh.crc
68419858 1848 -r-xr-xr-x 1 yarn hadoop 1890075 May 5 17:21 ./datanucleus-core-3.2.10.jar
68420005 8 -r-xr-xr-x 1 yarn hadoop 4467 May 5 17:21 ./aopalliance-1.0.jar
68419765 60 -r-xr-xr-x 1 yarn hadoop 61333 May 5 17:21 ./hive-shims-0.23-1.2.1000.2.4.0.0-169.jar
68419906 6228 -r-xr-xr-x 1 yarn hadoop 6377448 May 5 17:21 ./groovy-all-2.1.6.jar
68419867 32 -r-xr-xr-x 1 yarn hadoop 32693 May 5 17:21 ./asm-commons-3.1.jar
68421224 4 -rw-r--r-- 1 yarn hadoop 250 May 6 10:33 ./container_tokens
68421225 4 -rw-r--r-- 1 yarn hadoop 12 May 6 10:33 ./.container_tokens.crc
68419873 132 -r-xr-xr-x 1 yarn hadoop 132368 May 5 17:21 ./servlet-api-2.5-6.1.14.jar
68419900 5772 -r-xr-xr-x 1 yarn hadoop 5907375 May 5 17:21 ./hive-metastore-1.2.1000.2.4.0.0-169.jar
68419747 16 -r-xr-xr-x 1 yarn hadoop 13422 May 5 17:21 ./hive-shims-scheduler-1.2.1000.2.4.0.0-169.jar
68419645 380 -r-xr-xr-x 1 yarn hadoop 388864 May 5 17:21 ./mail-1.4.jar
68419599 40 -r-xr-xr-x 1 yarn hadoop 39670 May 5 17:21 ./hive-cli-1.2.1000.2.4.0.0-169.jar
68419864 20 -r-xr-xr-x 1 yarn hadoop 18336 May 5 17:21 ./ant-launcher-1.9.1.jar
68419831 1024 -r-xr-xr-x 1 yarn hadoop 1045744 May 5 17:21 ./leveldbjni-all-1.8.jar
68419975 104 -r-xr-xr-x 1 yarn hadoop 105134 May 5 17:21 ./jaxb-api-2.2.2.jar
68419663 1920 -r-xr-xr-x 1 yarn hadoop 1964393 May 5 17:21 ./hive-service-1.2.1000.2.4.0.0-169.jar
68419927 44 -r-xr-xr-x 1 yarn hadoop 41123 May 5 17:21 ./commons-cli-1.2.jar
68419756 12 -r-xr-xr-x 1 yarn hadoop 12284 May 5 17:21 ./oozie-hadoop-utils-hadoop-2-4.2.0.2.4.0.0-169.jar
68419784 32 -r-xr-xr-x 1 yarn hadoop 32331 May 5 17:21 ./hive-shims-0.20S-1.2.1000.2.4.0.0-169.jar
68419637 1224 -r-xr-xr-x 1 yarn hadoop 1251514 May 5 17:21 ./snappy-java-1.0.5.jar
68419750 64 -r-xr-xr-x 1 yarn hadoop 65261 May 5 17:21 ./oro-2.0.8.jar
68419729 104 -r-xr-xr-x 1 yarn hadoop 105112 May 5 17:21 ./servlet-api-2.5.jar
68419930 48 -r-xr-xr-x 1 yarn hadoop 48516 May 5 17:21 ./hive-ant-1.2.1000.2.4.0.0-169.jar
68419852 228 -r-xr-xr-x 1 yarn hadoop 232248 May 5 17:21 ./jackson-core-asl-1.9.13.jar
68419720 340 -r-xr-xr-x 1 yarn hadoop 346729 May 5 17:21 ./apache-log4j-extras-1.1.jar
68419972 900 -r-xr-xr-x 1 yarn hadoop 918372 May 5 17:21 ./hive-serde-1.2.1000.2.4.0.0-169.jar
68419768 532 -r-xr-xr-x 1 yarn hadoop 542268 May 5 17:21 ./tez-runtime-library-0.7.0.2.4.0.0-169.jar
68419624 16 -r-xr-xr-x 1 yarn hadoop 16030 May 5 17:21 ./geronimo-jta_1.1_spec-1.1.1.jar
68419807 12 -r-xr-xr-x 1 yarn hadoop 12131 May 5 17:21 ./jpam-1.1.jar
68420035 736 -r-xr-xr-x 1 yarn hadoop 751238 May 5 17:21 ./commons-collections4-4.1.jar
68419678 16 -r-xr-xr-x 1 yarn hadoop 12452 May 5 17:21 ./geronimo-annotation_1.0_spec-1.1.1.jar
68419639 532 -r-xr-xr-x 1 yarn hadoop 541070 May 5 17:21 ./zookeeper-3.4.6.2.4.0.0-169-tests.jar
68419711 8 -r-xr-xr-x 1 yarn hadoop 7779 May 5 17:21 ./tez-yarn-timeline-history-with-acls-0.7.0.2.4.0.0-169.jar
68420002 20 -r-xr-xr-x 1 yarn hadoop 18336 May 5 17:21 ./jackson-jaxrs-1.9.13.jar
68420014 184 -r-xr-xr-x 1 yarn hadoop 185245 May 5 17:21 ./curator-framework-2.6.0.jar
68419593 232 -r-xr-xr-x 1 yarn hadoop 236660 May 5 17:21 ./ST4-4.0.4.jar
68419717 256 -r-xr-xr-x 1 yarn hadoop 258686 May 5 17:21 ./hive-hcatalog-core-1.2.1000.2.4.0.0-169.jar
broken symlinks(find -L . -maxdepth 5 -type l -ls):
Log Type: launch_container.sh
Log Upload Time: Fri May 06 10:33:18 +0530 2016
Log Length: 35182
Showing 4096 bytes of 35182 total.
0 ]
then
exit $hadoop_shell_errorcode
fi
ln -sf "/customYarnDirectory/filecache/35/hive-service-1.2.1000.2.4.0.0-169.jar" "hive-service-1.2.1000.2.4.0.0-169.jar"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
exit $hadoop_shell_errorcode
fi
ln -sf "/customYarnDirectory/filecache/11/fst-2.24.jar" "fst-2.24.jar"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
exit $hadoop_shell_errorcode
fi
ln -sf "/customYarnDirectory/filecache/121/calcite-linq4j-1.2.0.2.4.0.0-169.jar" "calcite-linq4j-1.2.0.2.4.0.0-169.jar"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
exit $hadoop_shell_errorcode
fi
ln -sf "/customYarnDirectory/filecache/129/opencsv-2.3.jar" "opencsv-2.3.jar"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
exit $hadoop_shell_errorcode
fi
ln -sf "/customYarnDirectory/filecache/43/parquet-hadoop-bundle-1.6.0.jar" "parquet-hadoop-bundle-1.6.0.jar"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
exit $hadoop_shell_errorcode
fi
ln -sf "/customYarnDirectory/filecache/102/ant-launcher-1.9.1.jar" "ant-launcher-1.9.1.jar"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
exit $hadoop_shell_errorcode
fi
ln -sf "/customYarnDirectory/filecache/77/xz-1.0.jar" "xz-1.0.jar"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
exit $hadoop_shell_errorcode
fi
ln -sf "/customYarnDirectory/filecache/57/servlet-api-2.5.jar" "servlet-api-2.5.jar"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
exit $hadoop_shell_errorcode
fi
ln -sf "/customYarnDirectory/filecache/164/mysql-connector-java-5.1.38-bin.jar" "mysql-connector-java-5.1.38-bin.jar"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
exit $hadoop_shell_errorcode
fi
ln -sf "/customYarnDirectory/filecache/54/apache-log4j-extras-1.1.jar" "apache-log4j-extras-1.1.jar"
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
exit $hadoop_shell_errorcode
fi
# Creating copy of launch script
cp "launch_container.sh" "/tmp/hadoop/yarn/log/application_1462448478130_0014/container_e06_1462448478130_0014_01_000002/launch_container.sh"
chmod 640 "/tmp/hadoop/yarn/log/application_1462448478130_0014/container_e06_1462448478130_0014_01_000002/launch_container.sh"
# Determining directory contents
echo "ls -l:" 1>"/tmp/hadoop/yarn/log/application_1462448478130_0014/container_e06_1462448478130_0014_01_000002/directory.info"
ls -l 1>>"/tmp/hadoop/yarn/log/application_1462448478130_0014/container_e06_1462448478130_0014_01_000002/directory.info"
echo "find -L . -maxdepth 5 -ls:" 1>>"/tmp/hadoop/yarn/log/application_1462448478130_0014/container_e06_1462448478130_0014_01_000002/directory.info"
find -L . -maxdepth 5 -ls 1>>"/tmp/hadoop/yarn/log/application_1462448478130_0014/container_e06_1462448478130_0014_01_000002/directory.info"
echo "broken symlinks(find -L . -maxdepth 5 -type l -ls):" 1>>"/tmp/hadoop/yarn/log/application_1462448478130_0014/container_e06_1462448478130_0014_01_000002/directory.info"
find -L . -maxdepth 5 -type l -ls 1>>"/tmp/hadoop/yarn/log/application_1462448478130_0014/container_e06_1462448478130_0014_01_000002/directory.info"
exec /bin/bash -c "$JAVA_HOME/bin/java -server -XX:NewRatio=8 -Djava.net.preferIPv4Stack=true -Dhdp.version=2.4.0.0-169 -Xmx200m -Xmx10240m -Djava.io.tmpdir=./tmp -Djava.io.tmpdir=$PWD/tmp -Dlog4j.configuration=container-log4j.properties -Dyarn.app.container.log.dir=/tmp/hadoop/yarn/log/application_1462448478130_0014/container_e06_1462448478130_0014_01_000002 -Dyarn.app.container.log.filesize=0 -Dhadoop.root.logger=INFO,CLA -Dhadoop.root.logfile=syslog org.apache.hadoop.mapred.YarnChild 10.10.10.9 48256 attempt_1462448478130_0014_m_000000_0 6597069766658 1>/tmp/hadoop/yarn/log/application_1462448478130_0014/container_e06_1462448478130_0014_01_000002/stdout 2>/tmp/hadoop/yarn/log/application_1462448478130_0014/container_e06_1462448478130_0014_01_000002/stderr "
hadoop_shell_errorcode=$?
if [ $hadoop_shell_errorcode -ne 0 ]
then
exit $hadoop_shell_errorcode
fi
Log Type: stderr
Log Upload Time: Fri May 06 10:33:18 +0530 2016
Log Length: 679
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/tmp/hadoop/yarn/local/filecache/10/mapreduce.tar.gz/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/customYarnDirectory/filecache/142/slf4j-log4j12-1.6.6.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Try --help for usage instructions.
Intercepting System.exit(1)
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1]
Log Type: stdout
Log Upload Time: Fri May 06 10:33:18 +0530 2016
Log Length: 248850
Showing 4096 bytes of 248850 total.
per-disk-mb=1000
mapreduce.client.completion.pollinterval=5000
dfs.namenode.name.dir.restore=true
dfs.namenode.full.block.report.lease.length.ms=300000
dfs.namenode.secondary.http-address=warehouse.swtched.com:50090
s3.bytes-per-checksum=512
yarn.resourcemanager.max-log-aggregation-diagnostics-in-memory=10
yarn.resourcemanager.webapp.https.address=warehouse.swtched.com:8090
------------------------
Sqoop command arguments :
import-all-tables
--connect
jdbc:mysql://warehouse.swtched.com/erp
--username
hive
--password
hive
--target-dir
/sqlDW
Fetching child yarn jobs
tag id : oozie-d5f40d82e377a34b2285044af691b6c3
2016-05-06 10:33:11,085 INFO [main] client.RMProxy (RMProxy.java:createRMProxy(98)) - Connecting to ResourceManager at warehouse.swtched.com/10.10.10.9:8050
Child yarn jobs are found -
=================================================================
>>> Invoking Sqoop command line now >>>
2195 [main] WARN org.apache.sqoop.tool.SqoopTool - $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
2016-05-06 10:33:11,292 WARN [main] tool.SqoopTool (SqoopTool.java:loadPluginsFromConfDir(177)) - $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
2223 [main] INFO org.apache.sqoop.Sqoop - Running Sqoop version: 1.4.6.2.4.0.0-169
2016-05-06 10:33:11,320 INFO [main] sqoop.Sqoop (Sqoop.java:<init>(97)) - Running Sqoop version: 1.4.6.2.4.0.0-169
2238 [main] WARN org.apache.sqoop.tool.BaseSqoopTool - Setting your password on the command-line is insecure. Consider using -P instead.
2016-05-06 10:33:11,335 WARN [main] tool.BaseSqoopTool (BaseSqoopTool.java:applyCredentialsOptions(1026)) - Setting your password on the command-line is insecure. Consider using -P instead.
2238 [main] ERROR org.apache.sqoop.tool.BaseSqoopTool - Error parsing arguments for import-all-tables:
2016-05-06 10:33:11,335 ERROR [main] tool.BaseSqoopTool (BaseSqoopTool.java:hasUnrecognizedArgs(304)) - Error parsing arguments for import-all-tables:
2238 [main] ERROR org.apache.sqoop.tool.BaseSqoopTool - Unrecognized argument: --target-dir
2016-05-06 10:33:11,335 ERROR [main] tool.BaseSqoopTool (BaseSqoopTool.java:hasUnrecognizedArgs(307)) - Unrecognized argument: --target-dir
2239 [main] ERROR org.apache.sqoop.tool.BaseSqoopTool - Unrecognized argument: /sqlDW
2016-05-06 10:33:11,336 ERROR [main] tool.BaseSqoopTool (BaseSqoopTool.java:hasUnrecognizedArgs(307)) - Unrecognized argument: /sqlDW
Intercepting System.exit(1)
<<< Invocation of Main class completed <<<
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1]
Oozie Launcher failed, finishing Hadoop job gracefully
Oozie Launcher, uploading action data to HDFS sequence file: hdfs://warehouse.swtched.com:8020/user/oozie/oozie-oozi/0000000-160506102107434-oozie-oozi-W/sqoopAction--sqoop/action-data.seq
2016-05-06 10:33:11,406 INFO [main] zlib.ZlibFactory (ZlibFactory.java:<clinit>(49)) - Successfully loaded & initialized native-zlib library
2016-05-06 10:33:11,407 INFO [main] compress.CodecPool (CodecPool.java:getCompressor(153)) - Got brand-new compressor [.deflate]
Oozie Launcher ends
2016-05-06 10:33:11,425 INFO [main] mapred.Task (Task.java:done(1038)) - Task:attempt_1462448478130_0014_m_000000_0 is done. And is in the process of committing
2016-05-06 10:33:11,515 INFO [main] mapred.Task (Task.java:commit(1199)) - Task attempt_1462448478130_0014_m_000000_0 is allowed to commit now
2016-05-06 10:33:11,524 INFO [main] output.FileOutputCommitter (FileOutputCommitter.java:commitTask(582)) - Saved output of task 'attempt_1462448478130_0014_m_000000_0' to hdfs://warehouse.swtched.com:8020/user/oozie/oozie-oozi/0000000-160506102107434-oozie-oozi-W/sqoopAction--sqoop/output/_temporary/1/task_1462448478130_0014_m_000000
2016-05-06 10:33:11,601 INFO [main] mapred.Task (Task.java:sendDone(1158)) - Task 'attempt_1462448478130_0014_m_000000_0' done.
Log Type: syslog
Log Upload Time: Fri May 06 10:33:18 +0530 2016
Log Length: 2396
2016-05-06 10:33:09,097 WARN [main] org.apache.hadoop.metrics2.impl.MetricsConfig: Cannot locate configuration: tried hadoop-metrics2-maptask.properties,hadoop-metrics2.properties
2016-05-06 10:33:09,248 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
2016-05-06 10:33:09,248 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: MapTask metrics system started
2016-05-06 10:33:09,268 INFO [main] org.apache.hadoop.mapred.YarnChild: Executing with tokens:
2016-05-06 10:33:09,269 INFO [main] org.apache.hadoop.mapred.YarnChild: Kind: mapreduce.job, Service: job_1462448478130_0014, Ident: (org.apache.hadoop.mapreduce.security.token.JobTokenIdentifier@505fc5a4)
2016-05-06 10:33:09,364 INFO [main] org.apache.hadoop.mapred.YarnChild: Kind: RM_DELEGATION_TOKEN, Service: 10.10.10.9:8050, Ident: (owner=oozie, renewer=oozie mr token, realUser=oozie, issueDate=1462510977736, maxDate=1463115777736, sequenceNumber=41, masterKeyId=14)
2016-05-06 10:33:09,452 INFO [main] org.apache.hadoop.mapred.YarnChild: Sleeping for 0ms before retrying again. Got null now.
2016-05-06 10:33:09,750 INFO [main] org.apache.hadoop.mapred.YarnChild: mapreduce.cluster.local.dir for child: /tmp/hadoop/yarn/local/usercache/oozie/appcache/application_1462448478130_0014,/customYarnDirectory/usercache/oozie/appcache/application_1462448478130_0014
2016-05-06 10:33:10,007 INFO [main] org.apache.hadoop.conf.Configuration.deprecation: session.id is deprecated. Instead, use dfs.metrics.session-id
2016-05-06 10:33:10,486 INFO [main] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter: File Output Committer Algorithm version is 1
2016-05-06 10:33:10,486 INFO [main] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter: FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
2016-05-06 10:33:10,496 INFO [main] org.apache.hadoop.mapred.Task: Using ResourceCalculatorProcessTree : [ ]
2016-05-06 10:33:10,705 INFO [main] org.apache.hadoop.mapred.MapTask: Processing split: org.apache.oozie.action.hadoop.OozieLauncherInputFormat$EmptySplit@7db82169
2016-05-06 10:33:10,712 INFO [main] org.apache.hadoop.mapred.MapTask: numReduceTasks: 0
2016-05-06 10:33:10,753 INFO [main] org.apache.hadoop.conf.Configuration.deprecation: mapred.job.id is deprecated. Instead, use mapreduce.job.id
Labels:
- Apache Oozie