ISSUE: Sqoop import from a MySQL database into HDFS works when we run it from the command line:

sqoop import --connect jdbc:mysql://mysqlserver.somedomain.com:3306/sample --username user1 --password password --table sample_test --target-dir /user/user1/data
When we attempt to run the same command from an Oozie workflow, the job fails with the following error:
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1]
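The exit code alone does not show the root cause; when the JDBC driver is missing from the action's classpath it is typically a java.lang.ClassNotFoundException for com.mysql.jdbc.Driver. A minimal sketch for pulling the launcher log through the Oozie CLI, assuming a hypothetical Oozie URL and workflow job ID that you would substitute with your own:

    # Hypothetical Oozie endpoint and workflow job ID -- substitute your own values.
    OOZIE_URL=http://oozieserver.somedomain.com:11000/oozie
    JOB_ID=0000123-170403123456789-oozie-oozi-W

    # Dump the workflow job log and look for the underlying exception, e.g.
    # java.lang.ClassNotFoundException: com.mysql.jdbc.Driver
    oozie job -oozie "$OOZIE_URL" -log "$JOB_ID" | grep -i -A 2 "exception"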
SOLUTION: Setting oozie.libpath in job.properties and copying the database connector JAR to that directory resolved the issue. Original job.properties:
oozie.use.system.libpath=True
security_enabled=True
nameNode=hdfs://HD0
credentials={u'hcat': {'xml_name': u'hcat', 'properties': [('hcat.metastore.uri', u'thrift://server.somedomain.com:9083'), ('hcat.metastore.principal', u'hive/server.somedomain.com@SOMEREALM.COM')]}, u'hive2': {'xml_name': u'hive2', 'properties': [('hive2.jdbc.url', 'jdbc:hive2://server.somedomain.com:10000/default'), ('hive2.server.principal', 'hive/server.somedomain.com@SOMEREALM.COM')]}, None: {'xml_name': None, 'properties': []}}
jobTracker=hd0
mapreduce.job.user.name=user1
oozie.wf.application.path=/user/user1/a/workflow.xml
oozie.wf.rerun.failnodes=false
security_enabled=True
user.name=user1
New job.properties:
oozie.use.system.libpath=True
oozie.libpath=${nameNode}/user/oozie/share/lib/sqoop
security_enabled=True
nameNode=hdfs://HD0
credentials={u'hcat': {'xml_name': u'hcat', 'properties': [('hcat.metastore.uri', u'thrift://server.somedomain.com:9083'), ('hcat.metastore.principal', u'hive/server.somedomain.com@SOMEREALM.COM')]}, u'hive2': {'xml_name': u'hive2', 'properties': [('hive2.jdbc.url', 'jdbc:hive2://server.somedomain.com:10000/default'), ('hive2.server.principal', 'hive/server.somedomain.com@SOMEREALM.COM')]}, None: {'xml_name': None, 'properties': []}}
jobTracker=hd0
mapreduce.job.user.name=user1
oozie.wf.application.path=/user/user1/a/workflow.xml
oozie.wf.rerun.failnodes=false
security_enabled=True
user.name=user1
After making this change, copy the database connector JAR file to the same path set for oozie.libpath:
hdfs dfs -put /usr/share/java/mysql-connector-java.jar /user/oozie/share/lib/sqoop/
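If the sqoop directory does not already exist under the libpath, create it first and confirm the JAR landed there before re-running the workflow. A minimal sketch, assuming the /user/oozie/share/lib/sqoop path used above (on a secured cluster the directory may also need to be owned by the oozie user):

    # Create the libpath directory if missing, then verify the connector JAR is present.
    hdfs dfs -mkdir -p /user/oozie/share/lib/sqoop
    hdfs dfs -ls /user/oozie/share/lib/sqoop/ | grep mysql-connector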
Comments
It's an Oozie ShareLib problem. The commands below work for me:


**At Shell**

    sudo -u hdfs hadoop fs -chown cloudera:cloudera /user/oozie/share/lib/lib_20170719053712/sqoop
    hdfs dfs -put /var/lib/sqoop/mysql-connector-java.jar /user/oozie/share/lib/lib_20170719053712/sqoop
    sudo -u hdfs hadoop fs -chown oozie:oozie /user/oozie/share/lib/lib_20170719053712/sqoop
    
    oozie admin -oozie http://localhost:11000/oozie -sharelibupdate
    oozie admin -oozie http://localhost:11000/oozie -shareliblist sqoop
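
After the sharelib refresh, the workflow can be resubmitted so the Sqoop action picks up the connector; a sketch assuming the same localhost Oozie URL and the job.properties from the article:

    # Resubmit the workflow with the updated sharelib in place.
    oozie job -oozie http://localhost:11000/oozie -config job.properties -run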


**At Hue Sqoop Client**


    sqoop list-tables --connect jdbc:mysql://localhost/retail_db --username root --password cloudera


More details at:

https://blog.cloudera.com/blog/2014/05/how-to-use-the-sharelib-in-apache-oozie-cdh-5/

