
Hive load file into hive table from HDFS failing

Hi All,


I am using Talend 5.4/5.5 to connect to CDH 5.1 on a three-node cluster:


N1: CM, Hive (all services), DataNode, ZooKeeper, etc.

N2: RM, DataNode

N3: DataNode


When I try to load data from HDFS into a Hive table, the load fails, whereas the same command works just fine from the CLI:


hive> LOAD DATA  INPATH '/user/thor/test/rev_sub.txt' INTO TABLE revenue_subs;
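One thing worth checking (my assumption, not something stated in the post): LOAD DATA INPATH moves the source file into the table's warehouse directory rather than copying it, so if the CLI run already succeeded, the file is no longer at that path when the Talend job runs. A quick check, assuming the hdfs client is on your PATH:

```shell
# Verify the source file is still present before re-running LOAD DATA;
# a successful LOAD DATA INPATH moves (does not copy) the file away from here.
hdfs dfs -ls /user/thor/test/rev_sub.txt
```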


When I run the Talend job with the tHiveLoad component, I get the following exception:


[INFO ]: hive.metastore - Trying to connect to metastore with URI thrift://txwlcloud1:9083
[WARN ]: - No groups available for user thor
[INFO ]: hive.metastore - Waiting 1 seconds before next connection attempt.
[INFO ]: hive.metastore - Connected to metastore.
[ERROR]: org.apache.hadoop.hive.ql.Driver - FAILED: SemanticException Line 1:17 Invalid path ''/user/thor/test/rev_sub.txt''
org.apache.hadoop.hive.ql.parse.SemanticException: Line 1:17 Invalid path ''/user/thor/test/rev_sub.txt''
at org.apache.hadoop.hive.ql.parse.LoadSemanticAnalyzer.applyConstraints(
at org.apache.hadoop.hive.ql.parse.LoadSemanticAnalyzer.analyzeInternal(
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(
at org.apache.hadoop.hive.ql.Driver.compile(
at org.apache.hadoop.hive.ql.Driver.compile(
at org.apache.hadoop.hive.ql.Driver.compileAndRespond(
at org.apache.hive.service.cli.operation.SQLOperation.prepare(
at org.apache.hive.service.cli.operation.SQLOperation.prepare(
at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatement(
at org.apache.hive.service.cli.CLIService.executeStatement(
at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(
at org.apache.hive.jdbc.HiveStatement.execute(
at big_data.hivejob_0_1.HIVEJob.tHiveLoad_1Process(
at big_data.hivejob_0_1.HIVEJob.runJobInTOS(
at big_data.hivejob_0_1.HIVEJob.main(
Caused by: Failed on local exception: Message missing required fields: callId, status; Host Details : local host is: "TXWLHPW295/"; destination host is: "txwlcloud2":8020;
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(
at com.sun.proxy.$Proxy12.getFileInfo(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at com.sun.proxy.$Proxy12.getFileInfo(Unknown Source)

I have been struggling with this issue for a while.


The possible reasons could be:

1) a JDBC driver issue

2) something to do with the remote metastore
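A third possibility (my assumption, based on the "Message missing required fields: callId, status" line in the stack trace) is a Hadoop RPC version mismatch: that message is a typical symptom of an older Hadoop client library talking to a newer NameNode, e.g. older Hadoop jars bundled with the Talend job against the CDH 5.1 NameNode at txwlcloud2:8020. A rough way to compare versions, assuming shell access to both sides (the lib directory below is a hypothetical example path):

```shell
# On a cluster node: report the server-side Hadoop build.
hadoop version

# On the Talend side: list the Hadoop client jars bundled with the job
# (the lib directory is a hypothetical example; use your job's actual path).
ls /path/to/talend_job/lib | grep -i hadoop
```

If the bundled client jars are from an older major release than the cluster, updating the Talend job's Hadoop distribution/version setting to match CDH 5.1 would be the first thing to try.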


It would be a great help if you could point out why the load is failing.





Re: Hive load file into hive table from HDFS failing


I guess you might have missed some column names while creating the table:


Caused by: Failed on local exception: Message missing required fields: callId, status; Host Details : local host is: "TXWLHPW295/"; destination host is: "txwlcloud2":8020;


Do you have corresponding columns to store callId and status? You probably need to check your data file against the table you created.



