Created 03-20-2017 02:45 PM
I found this article from Hortonworks about importing TSV files. How can I apply the same approach to CSV files?
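(For context: the HBase ImportTsv tool described in such articles accepts a custom field separator, so the same command should work for CSV. A sketch, assuming a target table named "mytable", a column family "cf", and the file already uploaded to HDFS:)

```
hbase org.apache.hadoop.hbase.mapreduce.ImportTsv \
  -Dimporttsv.separator=',' \
  -Dimporttsv.columns=HBASE_ROW_KEY,cf:col1,cf:col2 \
  mytable /user/hbase/a.csv
```

The table name, column family, and path here are placeholders; adjust -Dimporttsv.columns to match your actual columns.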
Created 03-20-2017 03:49 PM
Please include the full exception. I would guess that your classpath is wrong, causing this to not find your HBase instance.
Created 03-20-2017 07:07 PM
Hi pbarna, can you try your CREATE TABLE command and see if it works for you? It's not working for me.
Thanks
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. java.lang.RuntimeException: MetaException(message:org.apache.hadoop.hive.serde2.SerDeException org.apache.hadoop.hive.hbase.HBaseSerDe: columns has 6 elements while hbase.columns.mapping has 7 elements (counting the key if implicit))
Created 03-20-2017 07:23 PM
Swap ":id" with ":key" in the hbase.columns.mapping. Just a simple typo. See https://cwiki.apache.org/confluence/display/Hive/HBaseIntegration for documentation on configuring the HBaseStorageHandler.
Created 03-21-2017 02:32 PM
I solved this problem using the following method, but I do want to know: why would one use the SerDe method and not this one?
[hbase@hadoop1 ~]$ more a.csv
5842,50,30,4,240,340
5843,52,32,5,250,360
5844,56,31,2,248,333

create table test3(Id int, lowT string, highT string, vib int, lowP string, highP string)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ","
STORED AS TEXTFILE
TBLPROPERTIES("skip.header.line.count"="1");

load data inpath '/user/hbase/a.csv' OVERWRITE INTO TABLE test3;
Loading data to table default.test3
Table default.test3 stats: [numFiles=1, numRows=0, totalSize=63, rawDataSize=0]
OK
Time taken: 0.668 seconds
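Regarding the question: the method above creates a plain Hive table whose data lives only as a text file in HDFS, so HBase never sees it. The SerDe method (HBaseStorageHandler) stores the rows in an actual HBase table, which gives you fast row-key lookups, random writes, and lets other HBase clients read the same data, at the cost of slower full scans. A sketch of combining the two, assuming an HBase-backed table (here hypothetically named test_hbase) was created with a matching hbase.columns.mapping:

```
-- Load the CSV into a plain staging table, then copy into HBase via Hive
INSERT OVERWRITE TABLE test_hbase SELECT * FROM test3;
```

So a common pattern is exactly what you did, plus one INSERT: plain text table for bulk loading, HBase-backed table for serving.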