Archives of Support Questions (Read Only)

This is an archived board for historical reference. Information and links may no longer be available or relevant.

Issue while using Hive View in Ambari console

Expert Contributor

I am facing the below issue while using the Hive View from the Ambari console:

E090 HDFS020 Could not write file /user/admin/hive/jobs/hive-job-3-2016-02-12_12-55/query.hql [HdfsApiException]

Searching through the guides, I found that the HDFS user directory needs to be set up. Following that guide, I am issuing the command hadoop fs -mkdir /user/admin as the HDFS user, but it throws the error below.

-bash-4.1$ hadoop fs -mkdir /user/admin
mkdir: `/user/admin': Input/output error

Need your help on this issue.

Note: Using HDP 2.3.4.0, configured via Ambari; the HDFS client is also running on that host.
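For reference, the user-directory setup described in the guides is usually run as the HDFS superuser rather than as the admin user directly; a sketch (the superuser name "hdfs" and the ownership "admin:hdfs" are assumptions based on a default HDP install):

```
# Run as the HDFS superuser (often "hdfs") on a node with the HDFS client
su - hdfs -c "hadoop fs -mkdir -p /user/admin"
su - hdfs -c "hadoop fs -chown admin:hdfs /user/admin"
```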

1 ACCEPTED SOLUTION


Validate the HDFS configuration and make sure the HDFS service is running.

An Input/output error can be thrown for multiple reasons (wrong configuration, NameNode not available, ...).

Could you please check the HDFS NameNode log and see if any error/exception is shown?
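A minimal set of checks for the points above, assuming shell access to a cluster node (the NameNode log path is an assumption; it varies by distribution and configuration):

```
# Confirm the NameNode is reachable and report overall HDFS health
hdfs dfsadmin -report

# Make sure the NameNode is not stuck in safe mode
hdfs dfsadmin -safemode get

# Scan recent NameNode log entries for errors/exceptions
tail -n 200 /var/log/hadoop/hdfs/hadoop-hdfs-namenode-*.log | grep -i -E "error|exception"
```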


12 REPLIES

New Member

Thanks. Setting hadoop.proxyuser.root.hosts=* worked for me, too.

Rising Star

Yes, for some reason, after enabling Ranger it will remove the hadoop.proxyuser.root.hosts setting even if you had it before... annoying.
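For anyone landing here, the proxyuser entries discussed above live in core-site.xml; a sketch, assuming the view runs as root (if your Ambari server runs as a different user, substitute that user in the property names):

```
<!-- core-site.xml: allow the "root" user to impersonate other users -->
<property>
  <name>hadoop.proxyuser.root.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.root.groups</name>
  <value>*</value>
</property>
```

After changing these properties, restart HDFS (or refresh the proxyuser settings) for them to take effect.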

New Member

Yes, it worked for me too after changing this setting in the HDFS configuration.