Support Questions

Hive Update - how to update a txt file in HDFS?

Expert Contributor

I have a text file, say elec.txt, in HDFS (Hadoop Distributed File System).

I need to update this text file with the help of Hadoop MapReduce and a Hive query.

Is it possible to do so?

Is it possible to write a MapReduce program in Hadoop that implements a Hive query?

1 ACCEPTED SOLUTION


Hi Babu - The more common approach is to write out a new file. HDFS is essentially an append-only system, so creating a new file that is a derivative of the original is very common practice. You can write a MapReduce program to produce the new file, or use a Hive query to write its results to a new directory. For example: INSERT OVERWRITE DIRECTORY '/user/me/output' SELECT UPPER(myColumn) FROM myTable. This creates one or more new files containing the modified data, which amounts to an update. In this case, we're upper-casing the myColumn column of the myTable table.
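
To make that concrete, here is a minimal sketch of the rewrite-instead-of-update flow, assuming the original file sits in a hypothetical directory /user/me/elec and holds a single tab-delimited string column; the table name elec_raw, the paths, and the column name are illustrative only, not from the original post.

-- Expose the existing file to Hive via an external table over its directory
CREATE EXTERNAL TABLE elec_raw (myColumn STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
LOCATION '/user/me/elec';

-- Write the transformed rows to a new directory; this is the "update"
INSERT OVERWRITE DIRECTORY '/user/me/output'
SELECT UPPER(myColumn) FROM elec_raw;

-- Outside Hive, swap the new output in for the original if you need the same path
-- (the output file name, e.g. 000000_0, can vary by Hive version and job):
--   hdfs dfs -rm /user/me/elec/elec.txt
--   hdfs dfs -mv /user/me/output/000000_0 /user/me/elec/elec.txt

Note that INSERT OVERWRITE DIRECTORY writes plain text with Hive's default field delimiter (Ctrl-A) unless you specify a ROW FORMAT, so the new file's layout may not match the original exactly.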

