
Oozie fails with: Oozie submit sqoop which execute hive script


Explorer

Hi:

I used Oozie to run a Sqoop command that imports MySQL data into a Hive table, but the job fails with an error.

The relevant part of workflow.xml:

<action name="sqoop-node"> <sqoop xmlns="uri:oozie:sqoop-action:0.2"> <job-tracker>${jobTracker}</job-tracker> <name-node>${nameNode}</name-node> <configuration> <property> <name>mapred.job.queue.name</name> <value>${queueName}</value> </property> </configuration> <command>import --connect jdbc:mysql://xxxx:3306/xxxx --username xxx--password xxx --table area --hive-import</command> </sqoop>

The job did save the MySQL data to HDFS, but it could not import it into Hive.

The log says:

ERROR org.apache.sqoop.tool.ImportTool - Encountered IOException running import job: java.io.IOException: Hive exited with status 1
    at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:394)
    at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:344)
    at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:245)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:514)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:148)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:244)
    at org.apache.oozie.action.hadoop.SqoopMain.runSqoopJob(SqoopMain.java:197)
    at org.apache.oozie.action.hadoop.SqoopMain.run(SqoopMain.java:177)
    at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:47)
    at org.apache.oozie.action.hadoop.SqoopMain.main(SqoopMain.java:46)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:241)
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)

Running the same import directly from the shell works fine:

sqoop import --connect jdbc:mysql://xxxx:3306/xxxx --username xxx --password xxx --table area --hive-import

Does anyone know how to solve this problem?


Re: Oozie fails with: Oozie submit sqoop which execute hive script

Expert Contributor

Can you verify that you have added the hive-site.xml to HDFS and included a reference to that file in your workflow? I don't see it referenced in the sqoop action.
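For reference, a minimal sketch of copying the cluster's hive-site.xml into the workflow's HDFS directory so the action can reference it. The source path below is the usual HDP client location and the HDFS target directory is only an illustration, not taken from this thread:

# copy the client hive-site.xml from the local node into the workflow directory on HDFS
hdfs dfs -put /etc/hive/conf/hive-site.xml /user/root/<workflow-dir>/hive-site.xml
# confirm it landed where the workflow expects it
hdfs dfs -ls /user/root/<workflow-dir>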


Re: Oozie fails with: Oozie submit sqoop which execute hive script

Explorer

Thanks for your response.

I am using HDP 2.3 and have not changed the config files.

The hive-site.xml is already on HDFS, but I don't know how to add a reference to that file in my workflow.

Can you tell me how?

--------------------------------------------------

<workflow-app xmlns="uri:oozie:workflow:0.2" name="sqoop-wf">
    <start to="sqoop-node"/>
    <action name="sqoop-node">
        <sqoop xmlns="uri:oozie:sqoop-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <configuration>
                <property>
                    <name>mapred.job.queue.name</name>
                    <value>${queueName}</value>
                </property>
            </configuration>
            <command>import --hive-import --connect jdbc:mysql://xxxxxx:3306/xxxx --username root --password xxxxx --table area</command>
        </sqoop>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>Sqoop failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end"/>
</workflow-app>

--------------------------------------------------


Re: Oozie fails with: Oozie submit sqoop which execute hive script

Explorer

Please see the comment where I responded to Kuldeep Kulkarni. I have added the hive-site.xml:

<configuration>
    <property>
        <name>mapred.job.queue.name</name>
        <value>${queueName}</value>
    </property>
</configuration>
<command>import --hive-import --connect jdbc:mysql://xxxx:3306/xxxx --username xx --password xxxx --table area</command>
<file>hive-site.xml#hive-site.xml</file>
</sqoop>

Now I am encountering the error "Intercepting System.exit(1)".

Can you help me?


Re: Oozie fails with: Oozie submit sqoop which execute hive script

Super Guru

@allen huang

You can add the hive-site.xml location (its HDFS path) under a <file> tag in your sqoop action:

            <file>/tmp/hive-site.xml#hive-site.xml</file>
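Before rerunning the workflow, it may also be worth confirming that the file is actually at the HDFS path referenced in the <file> tag (using the example path above):

hdfs dfs -ls /tmp/hive-site.xml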

Re: Oozie fails with: Oozie submit sqoop which execute hive script

Explorer

I have added hive-site.xml to the same directory as workflow.xml and added <file>hive-site.xml#hive-site.xml</file> to workflow.xml, but I still get the same errors.

So what can I do next?

-----------------------------------------------------

<start to="sqoop-node"/>

<action name="sqoop-node">

<sqoop xmlns="uri:oozie:sqoop-action:0.2">

<job-tracker>${jobTracker}</job-tracker>

<name-node>${nameNode}</name-node>

<prepare>

<delete path="${nameNode}/user/${wf:user()}/area"/>

</prepare>

<configuration>

<property>

<name>mapred.job.queue.name</name>

<value>${queueName}</value> </property>

</configuration>

<command>import --hive-import --connect jdbc:mysql://xxxx:3306/xxxx --username xx --password xxxx --table area</command>

<file>hive-site.xml#hive-site.xml</file>

</sqoop>

<ok to="end"/>

<error to="fail"/>

</action>

<kill name="fail">

<message>Sqoop failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>

</kill>

<end name="end"/>

</workflow-app>

-----------------------------------------------------


Re: Oozie fails with: Oozie submit sqoop which execute hive script

Explorer

Oh, that issue has been solved. I had forgotten to put the required packages under hive_home/lib.

But I have run into another problem:

Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1]

The log says:

-------------------------------------------------------

2016-03-29 16:32:29,564 WARN  [main] util.MRApps (MRApps.java:parseDistributedCacheArtifacts(610)) - cache file (mapreduce.job.cache.files) hdfs://cluster1.new/user/root/share/lib/sqoop/mysql-connector-java-5.1.38-bin.jar conflicts with cache file (mapreduce.job.cache.files) hdfs://cluster1.new/user/root/.staging/job_1459235663418_0020/libjars/mysql-connector-java-5.1.38-bin.jar This will be an error in Hadoop 2.0
2016-03-29 16:32:29,801 INFO  [main] impl.YarnClientImpl (YarnClientImpl.java:submitApplication(274)) - Submitted application application_1459235663418_0020
2016-03-29 16:32:29,823 INFO  [main] mapreduce.Job (Job.java:submit(1294)) - The url to track the job: http://cluster2.new:8088/proxy/application_1459235663418_0020/
2016-03-29 16:32:29,824 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1339)) - Running job: job_1459235663418_0020
2016-03-29 16:32:33,869 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1360)) - Job job_1459235663418_0020 running in uber mode : false
2016-03-29 16:32:33,870 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1367)) -  map 0% reduce 0%
2016-03-29 16:32:38,898 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1367)) -  map 100% reduce 0%
2016-03-29 16:32:39,904 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1378)) - Job job_1459235663418_0020 completed successfully
2016-03-29 16:32:39,938 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1385)) - Counters: 30
	File System Counters
		FILE: Number of bytes read=0
		FILE: Number of bytes written=1377552
		FILE: Number of read operations=0
		FILE: Number of large read operations=0
		FILE: Number of write operations=0
		HDFS: Number of bytes read=545
		HDFS: Number of bytes written=1090
		HDFS: Number of read operations=16
		HDFS: Number of large read operations=0
		HDFS: Number of write operations=8
	Job Counters 
		Launched map tasks=4
		Other local map tasks=4
		Total time spent by all maps in occupied slots (ms)=8075
		Total time spent by all reduces in occupied slots (ms)=0
		Total time spent by all map tasks (ms)=8075
		Total vcore-seconds taken by all map tasks=8075
		Total megabyte-seconds taken by all map tasks=16537600
	Map-Reduce Framework
		Map input records=51
		Map output records=51
		Input split bytes=545
		Spilled Records=0
		Failed Shuffles=0
		Merged Map outputs=0
		GC time elapsed (ms)=160
		CPU time spent (ms)=3120
		Physical memory (bytes) snapshot=923926528
		Virtual memory (bytes) snapshot=14719983616
		Total committed heap usage (bytes)=886046720
	File Input Format Counters 
		Bytes Read=0
	File Output Format Counters 
		Bytes Written=1090
14714 [main] INFO  org.apache.sqoop.mapreduce.ImportJobBase  - Transferred 1.0645 KB in 11.9988 seconds (90.8424 bytes/sec)
2016-03-29 16:32:39,944 INFO  [main] mapreduce.ImportJobBase (ImportJobBase.java:runJob(184)) - Transferred 1.0645 KB in 11.9988 seconds (90.8424 bytes/sec)
14716 [main] INFO  org.apache.sqoop.mapreduce.ImportJobBase  - Retrieved 51 records.
2016-03-29 16:32:39,946 INFO  [main] mapreduce.ImportJobBase (ImportJobBase.java:runJob(186)) - Retrieved 51 records.
14727 [main] INFO  org.apache.sqoop.manager.SqlManager  - Executing SQL statement: SELECT t.* FROM `area` AS t LIMIT 1
2016-03-29 16:32:39,957 INFO  [main] manager.SqlManager (SqlManager.java:execute(757)) - Executing SQL statement: SELECT t.* FROM `area` AS t LIMIT 1
14732 [main] INFO  org.apache.sqoop.hive.HiveImport  - Loading uploaded data into Hive
2016-03-29 16:32:39,962 INFO  [main] hive.HiveImport (HiveImport.java:importTable(195)) - Loading uploaded data into Hive
Intercepting System.exit(1)

<<< Invocation of Main class completed <<<

Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1]

------------------------------------------------------------

I don't know how to pinpoint where the error occurred.

The log doesn't say anything except "Intercepting System.exit(1)".

Has anyone encountered this problem?


Re: Oozie fails with: Oozie submit sqoop which execute hive script

Mentor

Please refer to this article on troubleshooting Oozie workflows: https://community.hortonworks.com/content/kbentry/9148/troubleshooting-an-oozie-flow.html
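In addition to the article, a rough outline of the usual command-line steps (the Oozie host and workflow job id below are placeholders, and the default Oozie port 11000 is assumed):

# show the workflow status and the external (YARN) job id of each action
oozie job -oozie http://<oozie-host>:11000/oozie -info <workflow-job-id>
# dump the Oozie-side log for the workflow
oozie job -oozie http://<oozie-host>:11000/oozie -log <workflow-job-id>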


Re: Oozie fails with: Oozie submit sqoop which execute hive script

Explorer

Oh, the URL you gave me explains how to find the error log.

But my issue is this: I have found the error log, but I cannot locate the cause of the failure in it.

The log says:

--------------------------------------------------

14732 [main] INFO  org.apache.sqoop.hive.HiveImport  - Loading uploaded data into Hive
2016-03-29 16:32:39,962 INFO  [main] hive.HiveImport (HiveImport.java:importTable(195)) - Loading uploaded data into Hive
Intercepting System.exit(1)

<<< Invocation of Main class completed <<<

Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1]

-------------------------------------------------------------------------

The job appeared to finish without reporting any error information, but it ended with "Intercepting System.exit(1)", and I don't know why it failed.


Re: Oozie fails with: Oozie submit sqoop which execute hive script

Super Guru

@allen huang - If you know the application id, you can get the logs via the command line, which should give you some clue:

yarn logs -applicationId <application-id>
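For example, using (purely as an illustration) the application id that appears in the launcher log above, and assuming YARN log aggregation is enabled on the cluster; otherwise the logs stay on the NodeManager hosts:

# fetch the aggregated container logs and scan them for the actual Hive failure
yarn logs -applicationId application_1459235663418_0020 > app.log
grep -iE -A 20 'error|exception' app.log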