
"E090 HDFS020 Could not write file" error occurred

Explorer

Hi, I just installed Ranger from Ambari UI.

When I run a query, the "E090 HDFS020 Could not write file" error occurs. I have already checked and applied the Support KB article as follows: https://community.hortonworks.com/content/supportkb/49578/e090-hdfs020-could-not-write-file-useradmi... However, the error still exists.

Before Ranger was installed, there was no error like this.

[Screenshot attached: 10036-スクリーンショット-2016-12-02-191249.png]

I have no idea what is causing this. Could you please tell me what I should check?

Regards, Takashi

1 ACCEPTED SOLUTION

Explorer

Hi, I set hadoop.proxyuser.root.hosts=* in the HDFS configs in Ambari.

Then the issue was gone.

Thank you, everyone!


10 REPLIES


Can you try creating a Ranger HDFS policy for the admin user on the resource /user/admin with recursive set to true?
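
If it is easier to script this than to click through the Ranger Admin UI, such a policy can also be created through Ranger's public REST API. The sketch below is only an illustration: the Ranger Admin URL (http://ranger-host:6080), the admin:admin credentials, and the HDFS service name "cl1_hadoop" are placeholders you would replace with your cluster's values.

# Sketch: create a Ranger HDFS policy granting the 'admin' user access to /user/admin, recursively.
# ranger-host, admin:admin and the service name "cl1_hadoop" are assumed values.
curl -u admin:admin -H "Content-Type: application/json" \
  -X POST http://ranger-host:6080/service/public/v2/api/policy \
  -d '{
        "service": "cl1_hadoop",
        "name": "user_admin_home",
        "resources": {
          "path": { "values": ["/user/admin"], "isRecursive": true }
        },
        "policyItems": [{
          "users": ["admin"],
          "accesses": [
            { "type": "read",    "isAllowed": true },
            { "type": "write",   "isAllowed": true },
            { "type": "execute", "isAllowed": true }
          ]
        }]
      }'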


@Takashi Nasu

Create the directory /user/admin in HDFS as below, then try running the query:

hdfs dfs -mkdir /user/admin

hdfs dfs -chown -R admin:admin /user/admin

Explorer

Thank you! The /user/admin directory is as below:

[admin@ip-192-168-xxx ~]$ hdfs dfs -chown admin:admin /user/admin
[admin@ip-192-168-xxx ~]$ hdfs dfs -ls  /user/admin
Found 3 items
drwx------   - admin hdfs          0 2016-12-01 11:47 /user/admin/.Trash
drwxr-xr-x   - admin hdfs          0 2016-12-01 11:52 /user/admin/.hiveJars
drwxr-xr-x   - admin hdfs          0 2016-12-01 11:39 /user/admin/hive

The Ranger policy is as below:

[Screenshots of the Ranger policy attached: ranger1.png, ranger2.png, ranger3.png]

I don't understand what the root cause is...

@Takashi Nasu

Are you able to run the queries now? If yes, then the issue could have been the missing HDFS directory and no Ranger policy in place.

Explorer
@Sindhu

Sorry, I still have the issue; I'm still getting the same error...

Master Guru

Are you running Apache Atlas?

Explorer

Hi, I set hadoop.proxyuser.root.hosts=* in the HDFS configs in Ambari.

Then the issue was gone.

Thank you, everyone!
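
For anyone checking the same thing: this property lives in core-site (HDFS > Configs in Ambari), and the related hadoop.proxyuser.root.groups setting may also need adjusting. A quick sketch, assuming a shell on a cluster node with the HDFS client configs deployed, to confirm the value the cluster actually sees:

# Print the effective proxyuser host setting from the client's core-site.xml
hdfs getconf -confKey hadoop.proxyuser.root.hosts

# After changing the value in Ambari, restart the affected services (Ambari will
# flag them) so the new core-site.xml takes effect, then re-run the failing query.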

Contributor

Thanks.

It also works for me.

I am using HDP 2.6.2 with Ambari 2.5. After installing, the default proxy value is

hadoop.proxyuser.root.hosts=ambari1.ec2.internal

After changing it to

hadoop.proxyuser.root.hosts=*

the error is resolved.

Contributor

@Takashi Nasu, is the issue resolved? I am having a similar issue. I tried all the solutions given above by folks, but I am still facing the same error. I do not have security installed on my system, and I believe it is just a permission issue, but I have not been able to identify where it could be. Can @Deepak Sharma or anyone help me resolve this, please?