Archives of Support Questions (Read Only)

This is an archived board for historical reference. Information and links may no longer be available or relevant.
Announcements
This board is archived and read-only for historical reference. To ask a new question, please post a new topic on the appropriate active board.

[Oozie - Sqoop] can import but can't export

Explorer

Hi. First, I tried importing some sample data from MySQL to HDFS using an Oozie Sqoop workflow, and everything worked fine.

 

Then I tried to export the result back to MySQL. The sqoop export command works fine when run from the command line.

But when I run the same export through an Oozie Sqoop workflow, I get this error: Launcher ERROR, reason: Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1]

 

I've tried many things with the mysql-connector-java.....jar file:

+ I uploaded it to HDFS and added it to the workflow's file path.

+ I also uploaded it to /user/oozie/share/lib/lib_.../sqoop/ and to /user/oozie/share/lib/sqoop/, and ran chmod 777 on it.

+ I also copied it to /opt/cloudera/parcels/CDH-5.3.2.../lib/sqoop/lib/ and to /var/lib/sqoop/, with chmod 777 as well.
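One thing worth noting about the sharelib attempts above: on newer Oozie versions, jars copied directly into the HDFS sharelib directories are not picked up until the sharelib is refreshed (or Oozie is restarted). A hedged sketch of the two usual approaches, assuming the paths from this thread and a default Oozie server URL (the port and host are assumptions, substitute your own):

```shell
# Option 1: put the JDBC driver in the workflow application's lib/ directory.
# Oozie automatically adds everything under lib/ to the action classpath.
hdfs dfs -mkdir -p /user/hue/mabu/oozie/lib
hdfs dfs -put mysql-connector-java-5.1.34-bin.jar /user/hue/mabu/oozie/lib/

# Option 2: if the jar went into the system sharelib, tell Oozie to reload it
# (the -sharelibupdate command exists on Oozie 4.1+; the URL is an assumption).
oozie admin -oozie http://00master.mabu.com:11000/oozie -sharelibupdate
oozie admin -oozie http://00master.mabu.com:11000/oozie -shareliblist sqoop
```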

 

Here is the job definition:

 

<workflow-app name="sqoop_export" xmlns="uri:oozie:workflow:0.4">
    <start to="export_potluck"/>
    <action name="export_potluck">
        <sqoop xmlns="uri:oozie:sqoop-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <command>sqoop export --connect jdbc:mysql://192.168.6.10/mabu --username root --password 123456 --table potluck --export-dir /user/hue/mabu/test_e</command>
            <file>/user/hue/mabu/oozie/mysql-connector-java-5.1.34-bin.jar#mysql-connector-java-5.1.34-bin.jar</file>
        </sqoop>
        <ok to="end"/>
        <error to="kill"/>
    </action>
    <kill name="kill">
        <message>Action failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end"/>
</workflow-app>

 

And here is the job configuration:

 

<configuration>
    <property>
        <name>hue-id-w</name>
        <value>31</value>
    </property>
    <property>
        <name>user.name</name>
        <value>hue</value>
    </property>
    <property>
        <name>oozie.use.system.libpath</name>
        <value>true</value>
    </property>
    <property>
        <name>mapreduce.job.user.name</name>
        <value>hue</value>
    </property>
    <property>
        <name>oozie.wf.rerun.failnodes</name>
        <value>false</value>
    </property>
    <property>
        <name>nameNode</name>
        <value>hdfs://00master.mabu.com:8020</value>
    </property>
    <property>
        <name>jobTracker</name>
        <value>00master.mabu.com:8032</value>
    </property>
    <property>
        <name>oozie.wf.application.path</name>
        <value>hdfs://00master.mabu.com:8020/user/hue/oozie/workspaces/_hue_-oozie-31-1425982013.65</value>
    </property>
</configuration>

 

I'd really appreciate any help, thanks!

 

1 ACCEPTED SOLUTION

Explorer

Found my solution: I needed to add two files:

+ db.hsqldb.properties

+ dn.hsqldb.script

to the Oozie job, and then the job works fine. I still don't understand why, because I don't need these two files when importing.
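For reference, extra files are attached to a sqoop action with additional <file> elements, just like the JDBC driver. A sketch of the fixed action under that assumption (the HDFS paths for the two HSQLDB files are assumptions, use wherever you uploaded them):

```xml
<sqoop xmlns="uri:oozie:sqoop-action:0.2">
    <job-tracker>${jobTracker}</job-tracker>
    <name-node>${nameNode}</name-node>
    <command>sqoop export --connect jdbc:mysql://192.168.6.10/mabu --username root --password 123456 --table potluck --export-dir /user/hue/mabu/test_e</command>
    <file>/user/hue/mabu/oozie/mysql-connector-java-5.1.34-bin.jar#mysql-connector-java-5.1.34-bin.jar</file>
    <!-- Sqoop's embedded HSQLDB metastore files, localized into the
         launcher's working directory (paths are assumptions). -->
    <file>/user/hue/mabu/oozie/db.hsqldb.properties#db.hsqldb.properties</file>
    <file>/user/hue/mabu/oozie/dn.hsqldb.script#dn.hsqldb.script</file>
</sqoop>
```

Each <file> element localizes the HDFS file into the action's working directory under the name after the #, which is presumably why the export succeeds once Sqoop can find its HSQLDB metastore files there.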

