Created 09-14-2016 07:31 PM
Hi,
Here are some general details of the setup:
HDP 2.4
kerberized cluster
I'm connecting a simple action that loads data from HDFS into a Hive table. There is a link to a sample with nearly identical details: http://www.tanzirmusabbir.com/2013/03/oozie-example-hive-actions.html
At the bottom of that link the author discusses the error I'm running into. However, after adding hive-site.xml to job-xml I still receive the same error. I also have the credentials set up like so:
<credentials>
    <credential name='hive_credentials' type='hcat'>
        <property>
            <name>hcat.metastore.uri</name>
            <value>thrift://x.x.x.x:9083</value>
        </property>
        <property>
            <name>hcat.metastore.principal</name>
            <value>hive/_HOST@HDP.FOO.LOCAL</value>
        </property>
    </credential>
</credentials>
And using those in the action:
<action name="hive-node" retry-max="0" cred="hive_credentials">
    <hive xmlns="uri:oozie:hive-action:0.2">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <job-xml>hive-site.xml</job-xml>
        <configuration>
            <property>
                <name>mapred.job.queue.name</name>
                <value>${queueName}</value>
            </property>
        </configuration>
        <script>script.hql</script>
        <param>INPUT_PATH=/user/oozie/importresult</param>
    </hive>
    <ok to="end"/>
    <error to="fail"/>
</action>
script file:
use mydb;
load data inpath '${INPUT_PATH}' overwrite into table test1;
Any help is appreciated (I've been trying to figure this one out for a bit).
Error full trace:
2016-09-14 17:02:12,201 ERROR [main] ql.Driver (SessionState.java:printError(962)) - FAILED: SemanticException [Error 10001]: Line 2:65 Table not found 'test1'
org.apache.hadoop.hive.ql.parse.SemanticException: Line 2:65 Table not found 'test1'
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer$TableSpec.<init>(BaseSemanticAnalyzer.java:769)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer$TableSpec.<init>(BaseSemanticAnalyzer.java:731)
    at org.apache.hadoop.hive.ql.parse.LoadSemanticAnalyzer.analyzeInternal(LoadSemanticAnalyzer.java:199)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:227)
    at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:459)
    at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:316)
    at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1189)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1237)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1126)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1116)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:216)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:168)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:379)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:314)
    at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:412)
    at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:428)
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:717)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:684)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:624)
    at org.apache.oozie.action.hadoop.HiveMain.runHive(HiveMain.java:306)
    at org.apache.oozie.action.hadoop.HiveMain.run(HiveMain.java:290)
    at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:47)
    at org.apache.oozie.action.hadoop.HiveMain.main(HiveMain.java:68)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:241)
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1709)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: org.apache.hadoop.hive.ql.metadata.InvalidTableException: Table not found test1
    at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:1122)
    at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:1073)
    at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:1060)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer$TableSpec.<init>(BaseSemanticAnalyzer.java:766)
    ... 35 more
Created 09-14-2016 07:35 PM
Can you run the following to see if the table actually exists in Hive:
hive -e 'use mydb; show tables;'
If test1 is not listed, that would explain the exception you are seeing.
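If the table turns out to be missing, note that LOAD DATA ... OVERWRITE does not create it; the target table must already exist. A minimal sketch of creating it first (the column definitions here are placeholders, since the real schema isn't shown in this thread):

```sql
use mydb;
-- LOAD DATA requires the target table to already exist;
-- id/val are assumed columns, substitute the real schema.
create table if not exists test1 (id int, val string);
load data inpath '/user/oozie/importresult' overwrite into table test1;
```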
Created 09-14-2016 09:20 PM
Yeah, I thought the OVERWRITE in LOAD DATA would create the table if it didn't exist (mapping it onto the HDFS data), but it doesn't. I tried other commands, and things are actually working now (show tables, create table, etc.). I also had to move the Atlas hook jars up to the Oozie shared lib directory, along with the MySQL connector. Thanks so much for your response.
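For anyone hitting the jar issue mentioned above, the general pattern is to copy the extra jars into the Oozie sharelib on HDFS and then refresh it. A rough sketch; the local paths and the sharelib timestamp directory are assumptions that vary by install:

```shell
# Paths below are assumptions; adjust for your HDP install.
# Copy the Atlas hook jars and the MySQL connector into the hive sharelib:
hadoop fs -put /usr/hdp/current/atlas-client/hook/hive/*.jar /user/oozie/share/lib/lib_<timestamp>/hive/
hadoop fs -put /usr/share/java/mysql-connector-java.jar /user/oozie/share/lib/lib_<timestamp>/hive/

# Tell Oozie to pick up the updated sharelib without a restart:
oozie admin -oozie http://oozie-host:11000/oozie -sharelibupdate
```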
Created 01-06-2017 05:46 PM
@mbalakrishnan,
Can you explain why your solution works? I ran into a similar problem and the above solution fixed my issue, so I am curious to know the reason behind it.
Regards,
Shashang Sheth
Created 01-06-2017 09:38 PM
I'm not sure what you mean. That error from Oozie can mean a number of things: the table may actually not exist, an incorrectly set classpath can also produce "table not found", and if the metastore URIs aren't set correctly Oozie may still fail with "table not found".
Can you explain which "solution" worked for you?
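To narrow down which of those causes applies, a few quick checks can help. These commands are a sketch; the hostnames and database/table names are assumptions taken from earlier in the thread:

```shell
# 1. Does the table exist at all?
hive -e 'use mydb; show tables;'

# 2. Is the metastore thrift port reachable? (URI from the credentials block)
nc -zv x.x.x.x 9083

# 3. Does the Oozie sharelib contain the Hive jars the action needs?
oozie admin -oozie http://oozie-host:11000/oozie -shareliblist hive
```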
Thanks
Created 01-09-2017 06:23 AM
When I run hive -e 'describe formatted ocods_temp.umoney_provision_allocation_details_xpose', it returns the attached error. But if I run hive -e "use ocods_temp; describe formatted umoney_provision_allocation_details_xpose", the command returns the required information. This suggests there is no issue with the table as such. hive-error.txt