Member since: 06-22-2016
Posts: 4
Kudos Received: 0
Solutions: 1

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 5877 | 08-05-2016 12:42 PM |
08-05-2016 12:42 PM
Setting oozie.libpath=${nameNode}/user/oozie/share/lib/sqoop and placing the database JDBC driver jar inside that directory solved the problem.
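For anyone else hitting this, a minimal sketch of the two steps, assuming the default Oozie share-lib location and the MySQL driver jar mentioned later in this thread (adjust the jar name, paths, and NameNode URI to your cluster):

# put the JDBC driver jar into the Sqoop share lib on HDFS
hdfs dfs -put mysql-connector-java-5.1.15-bin.jar /user/oozie/share/lib/sqoop/

# job.properties: point oozie.libpath at that directory
nameNode=hdfs://quickstart.cloudera:8020
oozie.libpath=${nameNode}/user/oozie/share/lib/sqoop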
08-02-2016 09:13 AM
I am running Sqoop jobs in parallel in a workflow and facing the same error. Since I am running the whole thing on an AWS cluster, I am convinced that it is not a memory problem. If anyone has other suggestions, please share.

<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<workflow-app xmlns="uri:oozie:workflow:0.1" name="WorkflowWithSqoopAction">
  <start to="fork-node"/>
  <fork name="fork-node">
    <path start="BOOKS"/>
    <path start="SALES"/>
    <path start="EMPLOYEE"/>
  </fork>
  <action name="BOOKS">
    <sqoop xmlns="uri:oozie:sqoop-action:0.2">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <command>import --connect jdbc:oracle:thin:@ip:1521/ORCL --username=x --password=x --table=BOOKS</command>
    </sqoop>
    <ok to="joining"/>
    <error to="fail"/>
  </action>
  <action name="SALES">
    <sqoop xmlns="uri:oozie:sqoop-action:0.2">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <command>import --connect jdbc:oracle:thin:@ip:1521/ORCL --username=x --password=x --table=SALES</command>
    </sqoop>
    <ok to="joining"/>
    <error to="fail"/>
  </action>
  <action name="EMPLOYEE">
    <sqoop xmlns="uri:oozie:sqoop-action:0.2">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <command>import --connect jdbc:oracle:thin:@ip:1521/ORCL --username=x --password=x --table=EMPLOYEE</command>
    </sqoop>
    <ok to="joining"/>
    <error to="fail"/>
  </action>
  <join name="joining" to="end"/>
  <kill name="fail">
    <message>Killed job due to error: ${wf:errorMessage(wf:lastErrorNode())}</message>
  </kill>
  <end name="end"/>
</workflow-app>

4514 [uber-SubtaskRunner] WARN org.apache.sqoop.tool.SqoopTool - $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
4538 [uber-SubtaskRunner] INFO org.apache.sqoop.Sqoop - Running Sqoop version: 1.4.6-cdh5.5.2
4550 [uber-SubtaskRunner] WARN org.apache.sqoop.tool.BaseSqoopTool - Setting your password on the command-line is insecure. Consider using -P instead.
4561 [uber-SubtaskRunner] WARN org.apache.sqoop.ConnFactory - $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
4636 [uber-SubtaskRunner] INFO org.apache.sqoop.manager.oracle.OraOopManagerFactory - Data Connector for Oracle and Hadoop is disabled.
4650 [uber-SubtaskRunner] INFO org.apache.sqoop.manager.SqlManager - Using default fetchSize of 1000
4650 [uber-SubtaskRunner] INFO org.apache.sqoop.tool.CodeGenTool - Beginning code generation
5065 [uber-SubtaskRunner] INFO org.apache.sqoop.manager.OracleManager - Time zone has been set to GMT
5140 [uber-SubtaskRunner] INFO org.apache.sqoop.manager.SqlManager - Executing SQL statement: SELECT t.* FROM BOOKS t WHERE 1=0
5166 [uber-SubtaskRunner] INFO org.apache.sqoop.orm.CompilationManager - HADOOP_MAPRED_HOME is /opt/cloudera/parcels/CDH-5.5.2-1.cdh5.5.2.p0.4/lib/hadoop-mapreduce
6719 [uber-SubtaskRunner] INFO org.apache.sqoop.orm.CompilationManager - Writing jar file: /tmp/sqoop-yarn/compile/82166d0efd936226575485974d82d7b8/BOOKS.jar
6731 [uber-SubtaskRunner] INFO org.apache.sqoop.manager.OracleManager - Time zone has been set to GMT
6738 [uber-SubtaskRunner] INFO org.apache.sqoop.manager.OracleManager - Time zone has been set to GMT
6743 [uber-SubtaskRunner] INFO org.apache.sqoop.mapreduce.ImportJobBase - Beginning import of BOOKS
6764 [uber-SubtaskRunner] INFO org.apache.sqoop.manager.OracleManager - Time zone has been set to GMT
6781 [uber-SubtaskRunner] WARN org.apache.sqoop.mapreduce.JobBase - SQOOP_HOME is unset. May not be able to find all job dependencies.
7497 [uber-SubtaskRunner] INFO org.apache.sqoop.mapreduce.db.DBInputFormat - Using read commited transaction isolation
7497 [uber-SubtaskRunner] INFO org.apache.sqoop.mapreduce.db.DataDrivenDBInputFormat - BoundingValsQuery: SELECT MIN(BOOK_ID), MAX(BOOK_ID) FROM BOOKS
7501 [uber-SubtaskRunner] WARN org.apache.sqoop.mapreduce.db.TextSplitter - Generating splits for a textual index column.
7501 [uber-SubtaskRunner] WARN org.apache.sqoop.mapreduce.db.TextSplitter - If your database sorts in a case-insensitive order, this may result in a partial import or duplicate records.
7501 [uber-SubtaskRunner] WARN org.apache.sqoop.mapreduce.db.TextSplitter - You are strongly encouraged to choose an integral split column.
Heart beat
Heart beat
Heart beat
(the "Heart beat" line repeats continuously)
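An aside on the TextSplitter warnings in that log: Sqoop is computing splits over the textual BOOK_ID column, which the log itself discourages. A minimal sketch of the same import with an explicit integral split column; BOOK_NUM is a hypothetical column name, so substitute a numeric column that actually exists in the table:

# same import, but split on a numeric column (BOOK_NUM is a placeholder)
sqoop import --connect jdbc:oracle:thin:@ip:1521/ORCL --username=x --password=x --table=BOOKS --split-by BOOK_NUM

Alternatively, running with -m 1 skips split generation entirely, at the cost of a single-mapper import.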
06-22-2016 11:53 AM
Every time I run a Sqoop command using Oozie, I get a "Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1]" error.

1. I have tried putting the MySQL connector jar under the lib folder, inside the folder that contains workflow.xml, and have referenced it in workflow.xml too.
2. I have also put the MySQL connector jar under /user/oozie/share/lib/sqoop, /user/oozie/share/lib/lib_20151118030154/sqoop, /user/oozie/share/lib, and /user/oozie/share/lib/sqoop/lib.
3. I have added oozie.use.system.libpath=true in the job.properties file.

I have restarted the server each time after placing the jar files, before running the Oozie command. Additionally, I tried downloading the oozie-core 2.3.2-cdh3u3 jar (which contains the SqoopMain class) from http://grepcode.com/file/repository.cloudera.com/content/repositories/releases/com.yahoo.oozie/oozie-core/2.3.2-cdh3u3/org/apache/oozie/action/hadoop/SqoopMain.java and added it to every path on the system where oozie-core jars are kept. I even referenced this jar file in workflow.xml under <archive> after placing it in the same lib folder as in point 1. Still no luck. Oozie is able to run the map-reduce programs and other programs that I have mapped in workflow.xml, but Sqoop jobs fail with the status KILLED in Hue.

Details:

job.properties

nameNode=hdfs://quickstart.cloudera:8020
jobTracker=quickstart.cloudera:8032
queueName=default
examplesRoot=examples
oozie.libpath=${nameNode}/user/oozie/share/lib
oozie.use.system.libpath=true
oozie.wf.rerun.failnodes=true
user.name=cloudera
oozie.wf.application.path=${nameNode}/user/${user.name}/${examplesRoot}/apps/map-reduce/workflow.xml
outputDir=outputOzzie

workflow.xml

<workflow-app xmlns="uri:oozie:workflow:0.2" name="map-reduce-wf">
  <start to="sqoopAction"/>
<action name="sqoopAction">
<sqoop xmlns="uri:oozie:sqoop-action:0.2">
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
  <command>import --connect jdbc:mysql://localhost/TEST --hive-import --table pet -m 1 --create-hive-table --hive-table default.pet --username root --password cloudera</command>
<archive>/user/cloudera/examples/apps/map-reduce/lib/mysql-connector-java-5.1.15-bin.jar</archive>
</sqoop>
<ok to="end"/>
<error to="fail"/>
</action>
<kill name="fail">
<message>Map/Reduce failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
</kill>
<end name="end"/>
</workflow-app>

Alternatively, I tried putting the Sqoop command in a shell script and running it; that fails too. If I put map-reduce or echo commands in the shell script, Oozie runs the job successfully, but for Sqoop commands it fails with a "Main class [org.apache.oozie.action.hadoop.ShellMain], exit code [1]" error.

job.properties

nameNode=hdfs://quickstart.cloudera:8020
jobTracker=quickstart.cloudera:8032
queueName=default
oozie.libpath=${nameNode}/user/oozie/share/lib
oozie.use.system.libpath=true
oozie.wf.rerun.failnodes=true
user.name=cloudera
oozie.wf.application.path=${nameNode}/user/cloudera/examples/shellscript/workflow.xml
outputDir=SqoopShellOutput

workflow.xml

<workflow-app name="script_oozie_job" xmlns="uri:oozie:workflow:0.3">
<start to='Test' />
<action name="Test">
<shell xmlns="uri:oozie:shell-action:0.1">
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
<exec>sqoopcmd.sh</exec>
<file>hdfs://quickstart.cloudera:8020/user/cloudera/examples/shellscript/sqoopcmd.sh#sqoopcmd.sh</file>
</shell>
<ok to="end"/>
<error to="fail"/>
</action>
<kill name="fail">
<message>Script failed</message>
</kill>
<end name='end' />
</workflow-app>

sqoopcmd.sh

#!/bin/bash
# sqoop script
sqoop import-all-tables \
  -m 1 \
  --connect jdbc:mysql://localhost/TEST \
  --username=root \
  --password=cloudera \
  --warehouse-dir=/user/hive/warehouse \
  --hive-import

There are no diagnostics for these jobs in the YARN Resource Manager. Moreover, I have tried this on both Cloudera versions, 5.7 and 5.5, with no luck. Please suggest how I can resolve this issue.
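For readers debugging similar "exit code [1]" failures: the real stack trace usually ends up in the launcher job's logs rather than in Hue's status view. A sketch of how to pull them, assuming the default Oozie server port on the quickstart VM; the workflow and application IDs are placeholders:

# launcher log for the workflow, via the Oozie CLI
oozie job -oozie http://quickstart.cloudera:11000/oozie -log <workflow-id>

# full container logs for the corresponding YARN application
yarn logs -applicationId <application-id>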