Problem loading data to table in this tutorial
LOAD DATA INPATH '/user/admin/Batting.csv' OVERWRITE INTO TABLE temp_batting;
H110 Unable to submit statement. Error while compiling statement: FAILED: HiveAccessControlException Permission denied: user [admin] does not have [READ] privilege on [hdfs://sandbox.hortonworks.com:8020/user/admin/elecMonthly_Orc] [ERROR_STATUS]
I created both the user/admin and temp/admin folders. As the hdfs superuser, I made admin the owner of the file, the folder, and even the parent folder, and gave full permissions in HDFS. This is clearly shown in Ambari, yet the error persists.
Can anyone help? Thanks
It was because when I thought I was creating an elecMonthly_Orc file, I had actually created an elecMonthly_Orc folder containing several files: _SUCCESS, part-r-r-00000-1a0c14e3-0dd0-42db-abc7-7f655a02f634.orc, and other similar ORC files. The files inside the elecMonthly_Orc directory were owned by Hive, which is why I got the permissions error.
Resolved by using command line as superuser hdfs:
hadoop fs -chown admin:admin /user/admin/elecMonthly_Orc/*.*
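Since the error was on the files inside the directory, a recursive chown over the whole directory should also work (a sketch; run it as the hdfs superuser and adjust the path for your cluster):

```
# -R applies ownership recursively: the directory itself,
# _SUCCESS, and every part-*.orc file inside it.
sudo -u hdfs hadoop fs -chown -R admin:admin /user/admin/elecMonthly_Orc
```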
Now I just have to figure out how to recombine Orc files in Hive!
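For anyone else hitting the recombining step: Hive treats a directory of ORC part files as a single table, so you usually don't need to merge them by hand — an external table pointed at the directory reads them all. A hedged sketch (the table name and columns below are assumptions; match them to whatever schema Spark wrote):

```sql
-- Hypothetical schema: replace the columns with the ones in your ORC files.
CREATE EXTERNAL TABLE elec_monthly (
  month STRING,
  usage_kwh DOUBLE
)
STORED AS ORC
LOCATION '/user/admin/elecMonthly_Orc';

-- For a managed ORC table, ALTER TABLE ... CONCATENATE can merge
-- many small part files into fewer, larger ones.
```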
Thanks Andrew. I think it was a mismatch between the Hive account running the statement and admin owning the file. Thanks for this, and sorry for the delay in replying.
Are you using the latest sandbox? Try the Ambari Hive view, it has the Upload File action specifically for csv files.
Thanks for answering so quickly!
admin is running statement as per tutorial
I thought I had already run hdfs chown on the files — they show in Ambari as owned by admin.
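One way to double-check is to list the path recursively, since the directory and the files inside it can have different owners (sketch; path taken from the error message above):

```
# -ls -R shows the owner of the directory and of every file
# inside it in one pass, so hive-owned part files stand out.
hadoop fs -ls -R /user/admin/elecMonthly_Orc
```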