
hue is not allowed to impersonate (403)



This is what I have installed, from the "About Hue" page:

Hue 2.6.1-3485
HDP 2.3.4
Hadoop 2.7.1
Pig 0.15.0
Hive-Hcatalog 1.2.1
Oozie 4.2.0
Ambari 2.2.1
HBase 1.1.2
Knox 0.6.0
Storm 0.10.0
Falcon 0.6.1

I have been trying to fix this error, which appears whenever I try to access the File Browser in Hue:

File "/usr/lib/hue/build/env/lib/python2.6/site-packages/Django-1.2.3-py2.6.egg/django/core/handlers/" in get_response
  100.                     response = callback(request, *callback_args, **callback_kwargs)
File "/usr/lib/hue/apps/filebrowser/src/filebrowser/" in index
  97.   if not request.fs.isdir(path):
File "/usr/lib/hue/desktop/libs/hadoop/src/hadoop/fs/" in isdir
  220.     sb = self._stats(path)
File "/usr/lib/hue/desktop/libs/hadoop/src/hadoop/fs/" in _stats
  205.       raise ex
Exception Type: WebHdfsException at /filebrowser/
Exception Value: SecurityException: Failed to obtain user group information: User: hue is not allowed to impersonate hue (error 403)

'hue' is the first user created in Hue, so it is a superuser.

Installation and configuration were done following the instructions in this link:

I have also tried Cloudera's tutorials, but without success.

What am I missing?

Can you help me fix this, so I can access File Browser, HCatalog, and Beeswax?


Expert Contributor

@Abiel Flrs The error "User: hue is not allowed to impersonate hue" can also mean that no home directory exists for user hue on HDFS.

You can create one (by running the commands as the hdfs user):

su - hdfs -c "hdfs dfs -mkdir /user/hue"

su - hdfs -c "hdfs dfs -chown hue:hadoop /user/hue" [or you can make hdfs the group owner]

su - hdfs -c "hdfs dfs -chmod 755 /user/hue"

This assumes the hue user belongs to the hadoop (or hdfs) group.
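As a quick verification step after running the commands above (a sketch; assumes standard HDP paths and that the hdfs superuser account exists on the node):

```shell
# Confirm the new home directory's owner, group, and mode
# (expect something like: drwxr-xr-x ... hue hadoop ... /user/hue)
su - hdfs -c "hdfs dfs -ls /user" | grep hue
```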

The other error (only on a secure cluster), Exception Type: WebHdfsException / SecurityException, may occur if WebHDFS authorization is not set to: auth=KERBEROS;proxyuser=hue

If you are trying to access Hive from Hue, also set this parameter: hive.server2.enable.impersonation
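One hedged way to check whether that Hive property is present in the deployed client configuration (assuming the default HDP client config path /etc/hive/conf) is:

```shell
# Look up the impersonation property in the deployed hive-site.xml, if present;
# grep -A1 also prints the <value> line that follows the matching <name> line
grep -A1 "hive.server2.enable.impersonation" /etc/hive/conf/hive-site.xml
```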



Master Mentor

Please go to HDFS > Configs > Custom core-site and add (note that property names are case-sensitive and must be all lowercase):

hadoop.proxyuser.hue.hosts = *

hadoop.proxyuser.hue.groups = *


I have done this before, but through the command line. It seems the changes were not saved, or something similar. After checking and applying them again, I still get a similar error.

Error here:

Exception Type: WebHdfsException at /filebrowser/
Exception Value: SecurityException: Failed to obtain user group information: Unauthorized connection for super-user: hue from IP (error 403)

It seems the solution for this is exactly the one you gave me, but it's not working. Is there something else I can do?

By the way, thanks for the help so far.

Super Collaborator

Ambari overwrites the configs in the XML files. You need to use Ambari to add those properties (and to make any configuration changes in the future).


Yes, I used Ambari to change those parameters.

Rising Star

@Abiel Flrs

You need to modify the properties from Ambari and restart any stale services afterwards.



What I meant is that I had done it from the command line, but then @Artem Ervits told me to change it. Since his instructions are not for the command line ("hdfs > config > custom core site"), I did it through the Ambari configuration, restarted all affected services, and restarted Hue from the command line. Then I tried again, and this is what shows up:

Unauthorized connection for super-user

Rising Star

@Abiel Flrs

Please check the values set for the following two properties from Ambari.



The value is true in both cases.

Rising Star

@Abiel Flrs

Check the OS level user/group mapping as well.

# id hue
# grep hue /etc/passwd
# grep hue /etc/group
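Since Hadoop resolves group membership on the server side, it can also help to compare the OS-level results above with what Hadoop's own group mapping returns for the user (a sketch; requires an HDFS client on the node):

```shell
# Groups as resolved by Hadoop's group mapping for user hue;
# these should include the group used in the chown above (e.g. hadoop)
hdfs groups hue
```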