Support Questions


Hive View Not Populating Database Explorer

Expert Contributor

(screenshot: 252-2015-10-16-09-29-17.png)

I've downloaded the latest sandbox and I'm running through a tutorial. It seems that the Hive view in Ambari can execute commands, but the Database Explorer will not show any of the tables that are present.

However, I can still execute statements and receive the results; I just can't see the list of tables. It isn't throwing any errors on screen, so I'm not sure what's wrong.

Is there any way to fix this?

1 ACCEPTED SOLUTION

Expert Contributor

It showed all of the databases when I ran the mysql command.

It could be an issue with user permissions: the 'admin' user that the view is using may not have access to the Hive databases or tables.

I found this out by SSHing into the sandbox and immediately running hive

[root@sandbox ~]# hive 

which gave me the error

org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="/user/root":hdfs:hdfs:drwxr-xr-x

However, if I did a "su hdfs" or "su hive", it would let me run hive and see the tables.
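That behavior follows from the plain POSIX-style check HDFS applies to the inode shown in the error. A minimal sketch of that check (simplified: it ignores the HDFS superuser and ACLs; the names and mode string mirror the error message):

```python
# Simplified model of the HDFS permission check from the error:
# inode "/user/root" is owned by hdfs:hdfs with mode drwxr-xr-x.
def can_write(user, groups, owner, group, mode):
    """mode is the 9-char rwx string, e.g. 'rwxr-xr-x'."""
    if user == owner:
        bits = mode[0:3]      # owner bits
    elif group in groups:
        bits = mode[3:6]      # group bits
    else:
        bits = mode[6:9]      # other bits
    return "w" in bits

# 'root' is neither the owner nor in the 'hdfs' group, so WRITE is denied:
print(can_write("root", {"root"}, "hdfs", "hdfs", "rwxr-xr-x"))  # False
# After "su hdfs", the owner bits apply and WRITE succeeds:
print(can_write("hdfs", {"hdfs"}, "hdfs", "hdfs", "rwxr-xr-x"))  # True
```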

Where I'm stumped is why I could run

select * from table_name;

and still have all the results returned.


11 REPLIES

Master Mentor

@zblanco@hortonworks.com

Please launch the mysql CLI and check for databases. I just want to make sure you have the privilege to run show databases;
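On the sandbox that check is a one-liner; the credentials below are an assumption (the bundled MySQL instance may use a different user/password on your image):

```shell
# List all databases visible to the current MySQL user.
# Adjust -u/-p for your sandbox's actual credentials.
mysql -u root -e "SHOW DATABASES;"
```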


Could you share some logs and configuration? I'd be interested in the log produced when you open the view or press the refresh button in the Database Explorer.


Were the tables you're expecting created from the Hive view? I ran into issues with the tables getting refreshed. Can you kill the session and launch a new one?

Expert Contributor

I think it's an issue with user permissions, but I don't know why I can still run commands like

select * from table_name;


Could table_name be an external table whose location is readable by admin?

Expert Contributor

I suppose so? I was really just using it as a placeholder; see the screenshot in the original question. I was able to read from and load into that "trucks_stage" table.


Expert Contributor

I ended up restarting the Sandbox and my problem was eventually fixed.


Have you followed the instructions in the Ambari documentation for configuring the Hive view? It states:

Ambari views use the doAs option for commands. This option enables the Ambari process user to impersonate the Ambari logged-in user. To avoid receiving permissions errors for job submissions and file save operations, you must create HDFS users for all Ambari users that use the views.

The HDFS permission error that you are showing above:

org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="/user/root":hdfs:hdfs:drwxr-xr-x

It clearly shows that /user/root, the default HDFS home directory for the root user, is not owned by root; that's the problem. Make sure /user/root is created and owned by the root user. Also, if the authorization mode in Hive is StorageBased, then /apps/hive/warehouse must be writable by root. This is usually accomplished by putting the executing user in the same group as the one that has write permission on /apps/hive/warehouse.
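A sketch of the usual fix, assuming the standard sandbox layout where the hdfs user is the HDFS superuser (run these from a shell on the sandbox):

```shell
# Create root's HDFS home directory and hand ownership to root.
sudo -u hdfs hdfs dfs -mkdir -p /user/root
sudo -u hdfs hdfs dfs -chown root:hdfs /user/root

# If Hive uses StorageBased authorization, also verify who owns the
# warehouse directory and which group has write permission on it.
sudo -u hdfs hdfs dfs -ls /apps/hive/warehouse
```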

I would also guess that in your current setup any query that requires a job to be submitted will fail, while simple fetch queries succeed. A plain select * with no filtering or aggregation is read directly by Hive without launching a job, which is why it still returned results.