WebHCat

# this will execute a Hive query and save the results to an HDFS directory called output in your home directory

curl -s -d execute="select+*+from+sample_08;" \
  -d statusdir="output" \
  'http://localhost:50111/templeton/v1/hive?user.name=root'
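The submit call returns a JSON body containing a job id. As a hedged sketch (job_0001 below is a placeholder for whatever id your cluster returns), you can poll that id through the WebHCat jobs endpoint:

# check the status of the submitted job; replace job_0001 with the id returned by the call above

curl -s 'http://localhost:50111/templeton/v1/jobs/job_0001?user.name=root'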

# if you run ls on the directory, it will contain two files, stderr and stdout

hdfs dfs -ls output

# if the job succeeded, you can cat the stdout file and view the results

hdfs dfs -cat output/stdout
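If the job failed instead, the stderr file in the same directory holds the error output:

# if the job failed, inspect the stderr file for the error message

hdfs dfs -cat output/stderr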

WebHDFS

# list the output directory, notice the webhdfs port

curl -i "http://sandbox.hortonworks.com:50070/webhdfs/v1/user/root/output/?op=LISTSTATUS"

# read the output file

curl -i -L "http://sandbox.hortonworks.com:50070/webhdfs/v1/user/root/output/stdout?op=OPEN"

# rename a file; if you get a permission error as user dr.who (the default anonymous user), add &user.name=root or another user appropriate for the context

curl -i -X PUT "http://sandbox.hortonworks.com:50070/webhdfs/v1/user/root/output/stdout?op=RENAME&user.name=root&destination=/user/root/newname"
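If the rename succeeds, the response body should be a small JSON confirmation, roughly:

# expected response body for a successful rename

{"boolean":true}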

# read the renamed file

curl -i -L "http://sandbox.hortonworks.com:50070/webhdfs/v1/user/root/newname?op=OPEN"
Comments
Contributor

A great beginning! Thanks!

Contributor

Hello @Artem Ervits

To read data through WebHCat, do I have to put the data inside HDFS? Is there a way to read this data directly via the REST API?

I mean the table data, not the metadata.

Thank you so much