Issue with: Hive Action - Oozie

Explorer

Hi,

 

I am trying to use Hive with Oozie via a Hive action. The Oozie workflow is supposed to load data from one Hive table into another: I have a table foo in Hive, and the workflow should load its data into a table "test".

 

I am using Cloudera VM with Hadoop 2.0.0-cdh4.4.0.


I run the workflow using the command below:

[cloudera@localhost oozie-3.3.2+92]$ oozie job -oozie http://localhost:11000/oozie -config examples/apps/hive/job.properties -run


When I check the JobTracker log file, it says: Table not found 'foo'. Any help?

 

--

cat script.q:

 

CREATE EXTERNAL TABLE test (
id int,
name string
)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY '\t'
STORED AS TEXTFILE
LOCATION
'/user/cloudera/test';

INSERT OVERWRITE table test SELECT * FROM foo;

--

 

cat job.properties:

 

nameNode=hdfs://localhost.localdomain:8020
jobTracker=localhost.localdomain:8021
queueName=default
examplesRoot=examples

oozie.use.system.libpath=true

oozie.wf.application.path=${nameNode}/user/${user.name}/${examplesRoot}/apps/hive


--

 

cat workflow.xml:

 

<?xml version="1.0" encoding="UTF-8"?>
<workflow-app xmlns="uri:oozie:workflow:0.2" name="hive-wf">
<start to="hive-node"/>

<action name="hive-node">
<hive xmlns="uri:oozie:hive-action:0.2">
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
<script>script.q</script>
</hive>
<ok to="end"/>
<error to="fail"/>
</action>

<kill name="fail">
<message>Hive failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
</kill>
<end name="end"/>

</workflow-app>


==

 

[cloudera@localhost hive]$ pwd
/usr/share/doc/oozie-3.3.2+92/examples/apps/hive

 

==

 

Current (local) dir = /mapred/local/taskTracker/cloudera/jobcache/job_201405081447_0019/attempt_201405081447_0019_m_000000_0/work
------------------------
hive-exec-log4j.properties
.action.xml.crc
tmp
hive-log4j.properties
hive-site.xml
action.xml
script.q
------------------------


Script [script.q] content:
------------------------
CREATE EXTERNAL TABLE test (
id int,
name string
)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY '\t'
STORED AS TEXTFILE
LOCATION
'/user/cloudera/test';

INSERT OVERWRITE table test SELECT * FROM foo;

------------------------

Hive command arguments :
-f
script.q

=================================================================

>>> Invoking Hive command line now >>>

Hadoop Job IDs executed by Hive:

Intercepting System.exit(10001)

<<< Invocation of Main class completed <<<

Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.HiveMain], exit code [10001]

Oozie Launcher failed, finishing Hadoop job gracefully


Oozie Launcher ends

 

stderr logs

Logging initialized using configuration in jar:file:/mapred/local/taskTracker/distcache/9141962611866023942_1400842701_327187723/localhost.localdomain/user/oozie/share/lib/hive/hive-common-0.10.0-cdh4.4.0.jar!/hive-log4j.properties
Hive history file=/tmp/mapred/hive_job_log_eecd5d6b-69d3-4dbd-94ed-9c86ef42443d_1563998739.txt
OK
Time taken: 9.816 seconds
FAILED: SemanticException [Error 10001]: Line 3:42 Table not found 'foo'
Log file: /mapred/local/taskTracker/cloudera/jobcache/job_201405081447_0019/attempt_201405081447_0019_m_000000_0/work/hive-oozie-job_201405081447_0019.log not present. Therefore no Hadoop jobids found
Intercepting System.exit(10001)
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.HiveMain], exit code [10001]

 

syslog logs

2014-05-12 10:12:10,156 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
2014-05-12 10:12:11,099 INFO org.apache.hadoop.mapred.TaskRunner: Creating symlink: /mapred/local/taskTracker/distcache/-2339055663322524001_1176285901_1902801582/localhost.localdomain/user/cloudera/examples/apps/hive/script.q <- /mapred/local/taskTracker/cloudera/jobcache/job_201405081447_0019/attempt_201405081447_0019_m_000000_0/work/script.q
2014-05-12 10:12:11,231 WARN org.apache.hadoop.conf.Configuration: session.id is deprecated. Instead, use dfs.metrics.session-id
2014-05-12 10:12:11,231 INFO org.apache.hadoop.metrics.jvm.JvmMetrics: Initializing JVM Metrics with processName=MAP, sessionId=
2014-05-12 10:12:11,544 INFO org.apache.hadoop.util.ProcessTree: setsid exited with exit code 0
2014-05-12 10:12:11,549 INFO org.apache.hadoop.mapred.Task: Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@375e293a
2014-05-12 10:12:11,755 INFO org.apache.hadoop.mapred.MapTask: Processing split: hdfs://localhost.localdomain:8020/user/cloudera/oozie-oozi/0000014-140508144817449-oozie-oozi-W/hive-node--hive/input/dummy.txt:0+5
2014-05-12 10:12:11,773 WARN mapreduce.Counters: Counter name MAP_INPUT_BYTES is deprecated. Use FileInputFormatCounters as group name and BYTES_READ as counter name instead
2014-05-12 10:12:11,775 INFO org.apache.hadoop.mapred.MapTask: numReduceTasks: 0


==

 

Thanks,

 

Rio

1 ACCEPTED SOLUTION

Explorer

I added this line to workflow.xml and it resolved the issue. I also copied hive-site.xml to the workflow directory.

 

 <job-xml>hive-site.xml</job-xml>
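For reference, a sketch of how the hive action in workflow.xml would then look, assuming hive-site.xml has been copied into the same HDFS application directory as workflow.xml. Without the job-xml, the launcher's Hive client runs with default configuration and does not see the metastore where foo is defined, which is why the CREATE succeeded but the INSERT failed with "Table not found 'foo'":

<action name="hive-node">
    <hive xmlns="uri:oozie:hive-action:0.2">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <!-- hive-site.xml sits next to workflow.xml in the app dir and is merged into the action configuration -->
        <job-xml>hive-site.xml</job-xml>
        <script>script.q</script>
    </hive>
    <ok to="end"/>
    <error to="fail"/>
</action>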


4 REPLIES


Guru

Thank you for reporting the solution back to us!  Glad it's resolved.

Explorer

Where should I copy hive-site.xml from? I tried a couple that I found under /opt/cloudera/parcels, one at a time, but no joy.

Mentor
You will need the gateway copy, which exists under /etc/hive/conf/ on a node designated as a Hive Gateway (check Hive -> Instances in CM to find which hosts have a gateway role).
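For example, a minimal sketch of that copy step, assuming the application path from the job.properties above (i.e. /user/cloudera/examples/apps/hive) and the gateway client config under /etc/hive/conf/:

# Run on a host that has the Hive Gateway role (Hive -> Instances in CM).
# Copy the client hive-site.xml next to workflow.xml in the workflow's HDFS app dir,
# then re-run the Oozie job.
hadoop fs -put /etc/hive/conf/hive-site.xml /user/cloudera/examples/apps/hive/
hadoop fs -ls /user/cloudera/examples/apps/hive/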