I was trying to upload a database table in Hive View. But after choosing the file and making the relevant settings, when I hit the "Upload Table" option, I get the following error:
java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:java.security.AccessControlException: Permission denied: user=hive, path="file:/":root:root:drwxr-xr-x)
I was denied permission to upload the table. I am using the maria_dev account, which is available by default in the Ambari UI.
Can somebody help me sort out this error?
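For reference, the directory Hive actually writes uploaded tables to can be inspected from the sandbox terminal. This is a minimal sketch assuming the HDP sandbox default warehouse path (`/apps/hive/warehouse`); your cluster may use a different `hive.metastore.warehouse.dir`:

```shell
# Check ownership and permissions of the Hive warehouse directory in HDFS
# (/apps/hive/warehouse is the HDP sandbox default; adjust if yours differs)
hdfs dfs -ls /apps/hive/warehouse

# If the hive user cannot write there, an administrator can fix ownership:
# sudo -u hdfs hdfs dfs -chown -R hive:hadoop /apps/hive/warehouse
```

Note that the error above complains about `file:/` (a local path), so a warehouse permission fix alone may not be the root cause; it simply rules one cause out.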
On running this command in the Hortonworks sandbox terminal, I got the following error:
Exception in thread "main" java.lang.RuntimeException: core-site.xml not found
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2640)
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2566)
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2451)
    at org.apache.hadoop.conf.Configuration.set(Configuration.java:1164)
    at org.apache.hadoop.conf.Configuration.set(Configuration.java:1136)
    at org.apache.hadoop.conf.Configuration.setBoolean(Configuration.java:1472)
    at org.apache.hadoop.util.GenericOptionsParser.processGeneralOptions(GenericOptionsParser.java:321)
    at org.apache.hadoop.util.GenericOptionsParser.parseGeneralOptions(GenericOptionsParser.java:487)
    at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:170)
    at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:153)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
    at org.apache.hadoop.fs.FsShell.main(FsShell.java:356)
I used to get a similar error earlier too. Please help me sort this out. 🙂
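The "core-site.xml not found" error usually means the Hadoop client cannot locate its configuration directory. A quick check, assuming the usual HDP sandbox location of `/etc/hadoop/conf` (an assumption; your install may differ):

```shell
# core-site.xml is normally read from the Hadoop client configuration directory.
# /etc/hadoop/conf is the typical HDP location (assumption; adjust if needed).
ls -l /etc/hadoop/conf/core-site.xml

# If the file exists but the client still cannot find it, pointing
# HADOOP_CONF_DIR at the directory usually resolves the error:
export HADOOP_CONF_DIR=/etc/hadoop/conf
```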
The root directory permission "root:root" does not look right. Is this a non-HDFS filesystem permission? In your output we see "root:root" for the path "file:/", whereas an HDFS path is usually given with the "hdfs:" prefix.
MetaException(message:java.security.AccessControlException: Permission denied: user=hive, path="file:/":root:root:drwxr-xr-x)
This indicates that your "Upload Table" command may not be right, or it might be using a Local File System path instead of an HDFS path. I suspect that your Upload Table query is trying to write to the local file system instead of the HDFS file system.
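The suspicion above can be checked directly: Hadoop resolves unqualified paths against `fs.defaultFS`, so if that property resolves to `file:/` rather than an `hdfs://` URI, writes will land on the local file system. A one-line check from the sandbox terminal:

```shell
# Print the default filesystem the Hadoop clients will use.
# On a correctly configured cluster this should be an hdfs:// URI;
# file:/// would explain the "file:/" path seen in the error.
hdfs getconf -confKey fs.defaultFS
```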
So could you please share your Hive View query, so that we know exactly what you are trying to perform?
Also, which user account did you use to log in to the Ambari UI / Hive View UI (the "admin" user or the "hive" user)?
Yes, I was uploading the table from my laptop's desktop, i.e. the local file system. I didn't use any Hive query to upload the table. In the Hive View of the Ambari UI, there is an "Upload Table" option. I clicked on that option, set the field delimiter to tab-delimited, and then clicked "Upload Table". After this, I got the error that I mentioned.
The account that I was using was maria_dev. However, I got a similar error with admin too.
Can you please share the exact screenshot and the steps that you are following while doing "Upload Table"?
I just tried and it seems to be working without any issue.
We want you to share screenshots of the steps you are following, because we want to know where the "file:/" path is coming from.
1st step: I go to Hive View and then choose the "Upload Table" option.
2nd step: I set the field delimiter in the settings option.
3rd step: I choose the file located on my laptop, i.e. the local file system.
4th step: I hit the "Upload Table" option.
And then I get the following error:
Is the Ranger service enabled on your cluster?
If Ranger is enabled, I'd suggest you go through this article:
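One way to check whether Ranger is even installed on the cluster is the Ambari REST API. This is a sketch only: the hostname, port, cluster name, and credentials below are sandbox-style assumptions and must be replaced with your own:

```shell
# Query Ambari for the RANGER service; a 404 response means Ranger is
# not installed on this cluster. Host, port, cluster name ("Sandbox"),
# and admin:admin credentials are assumptions for the HDP sandbox.
curl -s -u admin:admin \
  "http://sandbox.hortonworks.com:8080/api/v1/clusters/Sandbox/services/RANGER"
```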