I want to change my external table's HDFS location to a new path, which is an Amazon S3 location in my case.
I tried the following query:
ALTER TABLE table_name SET LOCATION 's3n://bucket/path/to/data';
But somehow it is still pointing to the old HDFS external path.
Is there a query I need to run to update the Hive metastore with the new external data location?
Any help would be greatly appreciated.
What happened right after you executed the ALTER TABLE command? Did you get any errors?
I am assuming you tried DESCRIBE EXTENDED <table_name> to determine the location the table is referring to?
DROP the current table (for external tables, the files on HDFS are not affected), and create a new one with the same name pointing to your S3 location.
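A minimal sketch of that approach — the column list and row format below are placeholders, not your actual schema:

```sql
-- Dropping an EXTERNAL table removes only its metastore entry; the data files stay put.
DROP TABLE IF EXISTS table_name;

-- Recreate with the same name, pointing straight at S3.
-- The columns and delimiter here are assumed; use your table's real definition.
CREATE EXTERNAL TABLE table_name (
  id INT,
  name STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION 's3n://bucket/path/to/data';
```

Because the table is EXTERNAL, neither the DROP nor the CREATE touches the underlying files.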
Check that you have provided the AWS access keys correctly, and whether any exceptions are reported in the Hive client log (e.g. /tmp/<user>/hive.log).
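If the keys turn out to be the problem, one way to supply them is per session — a sketch, with placeholder values you would replace with your own credentials:

```sql
-- Placeholder credentials for the s3n filesystem; substitute your own AWS keys.
SET fs.s3n.awsAccessKeyId=YOUR_ACCESS_KEY_ID;
SET fs.s3n.awsSecretAccessKey=YOUR_SECRET_ACCESS_KEY;
```

These properties can also be set in core-site.xml so that every Hive session picks them up automatically.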
On the Hive terminal, run the command below:
alter table FpML_Data set location 'hdfs://<namenode>/file_path_in_HDFS';
Here hdfs://<namenode> is the value of the fs.defaultFS property in core-site.xml.
Reply to this comment if you have any questions.
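Whichever location you set, you can confirm what the metastore now holds — a sketch using the FpML_Data table name from above:

```sql
-- The "Location:" line in the output shows the path currently stored in the metastore.
DESCRIBE FORMATTED FpML_Data;
```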
Note: when you change the location of a table with the ALTER command, the old data files are not moved to the new location.
On your issue: 1) Do you have any data files in the mentioned path?
2) Did you get any warnings or errors when you executed this ALTER command?
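So if the table looks empty after repointing it, the likely cause is that the files were never copied to S3. A hedged sketch of the full sequence, reusing the placeholder paths from the question:

```sql
-- Copy the files yourself first, outside Hive, e.g. from a shell with distcp:
--   hadoop distcp hdfs://<namenode>/old/table/path s3n://bucket/path/to/data
-- Then repoint the table:
ALTER TABLE table_name SET LOCATION 's3n://bucket/path/to/data';
-- And sanity-check that the data is visible from the new location:
SELECT COUNT(*) FROM table_name;
```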