
Issue while using Hive View in Ambari console

Expert Contributor

I am facing the following issue while using the Hive View from the Ambari console:

E090 HDFS020 Could not write file /user/admin/hive/jobs/hive-job-3-2016-02-12_12-55/query.hql [HdfsApiException]

Searching through the guides, I found that the HDFS user directory needs to be set up. Following that guide, I issued the command hadoop fs -mkdir /user/admin as the hdfs user, but it throws the error below.

-bash-4.1$ hadoop fs -mkdir /user/admin
mkdir: `/user/admin': Input/output error

I need your help with this issue.

Note: I am using HDP 2.3.4.0, which was configured using Ambari, and the HDFS client is running on that host.

1 ACCEPTED SOLUTION


Validate the HDFS configuration and make sure the HDFS service is running.

An Input/output error can be thrown for multiple reasons (wrong configuration, NameNode not available, etc.).

Could you please check the HDFS NameNode log and see if any error or exception is shown?
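A rough sketch of how to check this from the command line (the log location assumes the default HDP layout under /var/log/hadoop/hdfs/, and the exact file name depends on your NameNode host name):

su - hdfs -c "hdfs dfsadmin -report"        # are the DataNodes reporting in?
su - hdfs -c "hdfs dfsadmin -safemode get"  # is the NameNode stuck in safe mode?
grep -iE "error|exception" /var/log/hadoop/hdfs/hadoop-hdfs-namenode-*.log | tail -n 50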


12 REPLIES


Guru

@rajdip chaudhuri Can you please paste the contents of /var/log/hadoop/hdfs/<latest_NN>.log and /var/log/ambari-server/ambari-server.log? Also, please check whether the HDFS and Ambari services are running.
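If it helps, a minimal sketch of how to collect that information from the command line (the Ambari commands assume you are on the Ambari server host; log file names follow the default HDP locations):

ambari-server status                                   # is the Ambari server running?
ambari-agent status                                    # is the agent on this node running?
tail -n 200 /var/log/ambari-server/ambari-server.log   # recent Ambari server activity
ls -lt /var/log/hadoop/hdfs/ | head                    # find the latest NameNode log to paste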


@rajdip chaudhuri

Yes, you need to follow the documentation to do the initial setup.

As Jonas suggested in the update above, please verify and ensure that HDFS is healthy by running the

hdfs fsck /user

command as the hdfs user. You can check this in the NameNode UI as well.
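For reference, on a healthy filesystem hdfs fsck ends with a summary line like the following (a sketch of the typical output):

The filesystem under path '/user' is HEALTHY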

Also, I hope you are trying to create the folder as the 'hdfs' user or as a user who has permission to create the /user/admin folder. Are you able to run

hdfs dfs -ls -R /user

?
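If the folder does not exist, a minimal sketch of the usual home-directory setup (assuming the Hive View logs in as 'admin' and that the 'hdfs' group is appropriate for your cluster; adjust both to your environment):

su - hdfs -c "hdfs dfs -mkdir -p /user/admin"          # create the home directory as the HDFS superuser
su - hdfs -c "hdfs dfs -chown admin:hdfs /user/admin"  # hand ownership to the Hive View user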

Thanks & Rgds

Venkat

New Contributor

When I tried it, there are two properties, one is Key and one is Value. What should I put in the Key?

Expert Contributor

I have the same issue on HDP 2.5 and Ambari 2.4.0.1.

I have created all the necessary HDFS directories and granted the proper permissions, but a simple 'show tables' query just doesn't work. Digging into the HDFS logs, I found that the Ambari Hive View didn't create the staging directory under /user/admin/hive/jobs. It should create the hive-job-6-2016-12-16_06-15 directory before trying to write the .hql file.

$ tail -f hdfs-audit.log | grep '/user/admin'
2016-12-16 06:15:55,156 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.0.178  cmd=getfileinfo src=/user/admin/hive/jobs/hive-job-6-2016-12-16_06-15 dst=null  perm=null proto=webhdfs

This error started happening after I enabled the Ranger plugin for Hive.

I also have another, working Ambari Hive View on HDC. It creates the staging directories and the .hql file properly.

$ tail -f hdfs-audit.log | grep '/user/admin'
2016-12-16 06:17:29,003 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.207  cmd=getfileinfo src=/user/admin dst=null  perm=null proto=webhdfs
2016-12-16 06:17:31,148 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.207  cmd=getfileinfo src=/user/admin/.AUTO_HIVE_INSTANCE.defaultSettings dst=null  perm=null proto=webhdfs
2016-12-16 06:17:35,474 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.207  cmd=getfileinfo src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17  dst=null  perm=null proto=webhdfs
2016-12-16 06:17:35,486 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.119  cmd=create  src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/query.hql  dst=null  perm=admin:hdfs:rw-r--r-- proto=rpc
2016-12-16 06:17:35,509 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.120  cmd=create  src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/logs dst=null  perm=admin:hdfs:rw-r--r-- proto=rpc
2016-12-16 06:17:35,522 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.207  cmd=getfileinfo src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/query.hql  dst=null  perm=null proto=webhdfs
2016-12-16 06:17:35,523 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.207  cmd=open  src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/query.hql  dst=null  perm=null proto=webhdfs
2016-12-16 06:17:35,527 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.119  cmd=open  src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/query.hql  dst=null  perm=null proto=rpc
2016-12-16 06:17:35,582 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.207  cmd=getfileinfo src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/query.hql  dst=null  perm=null proto=webhdfs
2016-12-16 06:17:35,583 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.207  cmd=open  src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/query.hql  dst=null  perm=null proto=webhdfs
2016-12-16 06:17:35,587 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.119  cmd=open  src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/query.hql  dst=null  perm=null proto=rpc
2016-12-16 06:17:35,590 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.207  cmd=getfileinfo src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/query.hql  dst=null  perm=null proto=webhdfs
2016-12-16 06:17:35,593 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.207  cmd=getfileinfo src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/query.hql  dst=null  perm=null proto=webhdfs
2016-12-16 06:17:35,765 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.207  cmd=getfileinfo src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/logs dst=null  perm=null proto=webhdfs
2016-12-16 06:17:35,769 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.119  cmd=open  src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/logs dst=null  perm=null proto=rpc
2016-12-16 06:17:35,771 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.207  cmd=getfileinfo src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/logs dst=null  perm=null proto=webhdfs
2016-12-16 06:17:35,774 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.207  cmd=getfileinfo src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/logs dst=null  perm=null proto=webhdfs
2016-12-16 06:17:35,803 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.207  cmd=getfileinfo src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/logs dst=null  perm=null proto=webhdfs
2016-12-16 06:17:35,807 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.120  cmd=open  src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/logs dst=null  perm=null proto=rpc
2016-12-16 06:17:35,810 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.207  cmd=getfileinfo src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/logs dst=null  perm=null proto=webhdfs
2016-12-16 06:17:35,812 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.207  cmd=getfileinfo src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/logs dst=null  perm=null proto=webhdfs
2016-12-16 06:17:45,915 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.207  cmd=getfileinfo src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/logs dst=null  perm=null proto=webhdfs
2016-12-16 06:17:45,919 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.120  cmd=open  src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/logs dst=null  perm=null proto=rpc
2016-12-16 06:17:45,921 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.207  cmd=getfileinfo src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/logs dst=null  perm=null proto=webhdfs
2016-12-16 06:17:45,923 INFO FSNamesystem.audit: allowed=true ugi=admin (auth:PROXY) via root (auth:SIMPLE) ip=/10.0.1.207  cmd=getfileinfo src=/user/admin/hive/jobs/hive-job-19-2016-12-16_06-17/logs dst=null  perm=null proto=webhdfs

Expert Contributor

I do not know how to fix it.

I also checked the NameNode log; no errors occurred.

Expert Contributor

I solved this by setting hadoop.proxyuser.root.hosts=*.

For some reason, the HDFS request to create the directory was sent from a host where neither the Ambari Server nor HiveServer2 is running. I am not sure why, but changing this setting solved the issue.
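For reference, in an Ambari-managed cluster this is a core-site property under the HDFS configuration, which also answers the Key/Value question above: the Key is the property name and the Value is its value. A minimal sketch of the settings (the groups entry is an assumption; some clusters only need the hosts entry):

hadoop.proxyuser.root.hosts=*
hadoop.proxyuser.root.groups=*

Ambari will prompt you to restart the affected services after the change so the proxy-user setting takes effect.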

Expert Contributor

I had the same problem with a valid Linux/HDFS user as the Ambari ID; the solution worked, thanks!


Thanks, your solution of setting hadoop.proxyuser.root.hosts=* also worked for me.