I am using Quickstart VM 5.8.
I have loaded some flat files into HDFS.
I have created an external table in Hive as below:
CREATE EXTERNAL TABLE abc (ID INT, Price DOUBLE, Start_DTTM STRING, DEL_DT_TM STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' STORED AS TEXTFILE;
LOAD DATA INPATH '/user/cloudera/CPC/QSM/QSM_MarToApr2016.csv' INTO TABLE abc;
The data loaded successfully into the Hive table,
but the files are vanishing from HDFS.
Thanks for the reply.
I have uploaded the flat file to the HDFS location (/user/clouder/QSM/),
and I created the table as above and loaded the data.
The data loaded successfully into Hive,
but I don't want the data moved into the Hive warehouse.
Hive queries should work without the data vanishing from HDFS.
Please guide me.
When you create the external table, specify LOCATION '<hdfs_path>' (i.e., the default Hive warehouse location is overridden by LOCATION).
Note that LOAD DATA INPATH moves the files into the table's location even for an external table, so if you want the files to stay where they are, point LOCATION at the directory that already holds them and skip the LOAD step entirely. Either way, dropping an external table only removes the metadata pointer from Hive; the data in HDFS is not deleted.
CREATE EXTERNAL TABLE text1 (wban INT, date STRING) ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' LOCATION '/hive/data/text1';
LOAD DATA INPATH 'hdfs:/data/2000.txt' INTO TABLE text1;
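For the original question, a minimal sketch of the no-move approach (the path and schema below are taken from the question and are assumptions; adjust them to your data):

```sql
-- External table pointing directly at the existing HDFS directory.
-- No LOAD DATA statement is needed: Hive reads the CSV files in place,
-- and they are never moved into the warehouse directory.
-- (Path /user/cloudera/QSM/ and the column list are assumed from the
-- question above.)
CREATE EXTERNAL TABLE abc (
  ID         INT,
  Price      DOUBLE,
  Start_DTTM STRING,
  DEL_DT_TM  STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE
LOCATION '/user/cloudera/QSM/';
```

Running DROP TABLE abc; afterwards removes only the Hive metadata; the CSV files remain under /user/cloudera/QSM/ in HDFS.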