Created 04-09-2016 09:37 PM
Hi!
This is what I have installed, from the "About Hue" page:
Hue | 2.6.1-3485 |
HDP | 2.3.4 |
Hadoop | 2.7.1 |
Pig | 0.15.0 |
Hive-Hcatalog | 1.2.1 |
Oozie | 4.2.0 |
Ambari | 2.2.1 |
HBase | 1.1.2 |
Knox | 0.6.0 |
Storm | 0.10.0 |
Falcon | 0.6.1 |
I have been trying to fix this error, which appears whenever I try to access the File Browser in Hue:
Traceback:
File "/usr/lib/hue/build/env/lib/python2.6/site-packages/Django-1.2.3-py2.6.egg/django/core/handlers/base.py" in get_response
  100. response = callback(request, *callback_args, **callback_kwargs)
File "/usr/lib/hue/apps/filebrowser/src/filebrowser/views.py" in index
  97. if not request.fs.isdir(path):
File "/usr/lib/hue/desktop/libs/hadoop/src/hadoop/fs/webhdfs.py" in isdir
  220. sb = self._stats(path)
File "/usr/lib/hue/desktop/libs/hadoop/src/hadoop/fs/webhdfs.py" in _stats
  205. raise ex
Exception Type: WebHdfsException at /filebrowser/
Exception Value: SecurityException: Failed to obtain user group information: org.apache.hadoop.security.authorize.AuthorizationException: User: hue is not allowed to impersonate hue (error 403)
'hue' is the first user created in Hue, so it is a superuser.
Installation and configuration were done following the instructions in this link:
I have also tried Cloudera's tutorials, but without success.
What am I missing?
Can you help me fix this, so I can access the File Browser, HCatalog, and Beeswax?
Created 04-12-2016 12:27 PM
@Abiel Flrs The error "User: hue is not allowed to impersonate hue" can also indicate that no home directory exists for the user hue on HDFS.
You can create one (by running as the hdfs user):
su - hdfs -c "hdfs dfs -mkdir /user/hue"
su - hdfs -c "hdfs dfs -chown hue:hadoop /user/hue" [or you can make hdfs the group owner]
su - hdfs -c "hdfs dfs -chmod 755 /user/hue"
This assumes the hue user belongs to the hadoop (or hdfs) group.
The other error (only in the case of a secure cluster), Exception Type: WebHdfsException, a security exception, may occur if WebHDFS Authorization is not set to: auth=KERBEROS;proxyuser=hue
If you are trying to access Hive from Hue, also set this parameter: hive.server2.enable.impersonation
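As an illustration of the Kerberos side on Hue itself: the HDFS cluster section of hue.ini typically needs security_enabled turned on for a Kerberized cluster. This is a sketch, not your exact config; the hostnames below are placeholders, and the port numbers are common HDP defaults:

```ini
[hadoop]
  [[hdfs_clusters]]
    [[[default]]]
      # placeholder hostnames; point these at your actual NameNode
      fs_defaultfs=hdfs://namenode.example.com:8020
      webhdfs_url=http://namenode.example.com:50070/webhdfs/v1
      # required on a Kerberized cluster, harmless to omit otherwise
      security_enabled=true
```

On HDP these values are usually managed through Ambari's Hue configuration rather than by editing hue.ini directly.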
Created 04-09-2016 10:17 PM
Please go to HDFS > Configs > Custom core-site and add:
hadoop.proxyuser.hue.hosts = *
hadoop.proxyuser.hue.groups = *
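In core-site.xml itself, those two properties look like the following sketch. The wildcard values are the permissive option; note that if hosts is instead restricted to specific hostnames, a connection coming from an unlisted address (such as 127.0.0.1) will be rejected with a 403:

```xml
<!-- core-site.xml: allow the hue user to impersonate other users -->
<property>
  <name>hadoop.proxyuser.hue.hosts</name>
  <!-- "*" = any host; can be narrowed to the Hue server's hostname/IP -->
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hue.groups</name>
  <!-- "*" = users in any group may be impersonated -->
  <value>*</value>
</property>
```

Property names are case-sensitive, so they must start with lowercase "hadoop.", and the NameNode must pick up the change (restart stale services) before it takes effect.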
Created 04-10-2016 09:54 PM
I had done this before, but through the command line; it seems the changes weren't saved or something. After checking and applying them again, I still get a similar error.
Error here:
Exception Type: WebHdfsException at /filebrowser/
Exception Value: SecurityException: Failed to obtain user group information: org.apache.hadoop.security.authorize.AuthorizationException: Unauthorized connection for super-user: hue from IP 127.0.0.1 (error 403)
It seems the solution for this is exactly the one you handed me, but it's not working. Is there something else I can do?
By the way, thanks for the help so far.
Created 04-11-2016 10:36 AM
Ambari overwrites the configs in the XML files. You need to use Ambari to add those properties (and to make any configuration changes in the future).
Created 04-11-2016 02:36 PM
Yes, I used ambari to change those parameters.
Created 04-11-2016 09:02 AM
You need to modify the properties from Ambari and restart any stale services afterwards.
Created 04-11-2016 02:35 PM
Yes,
What I meant is that I had done it from the command line, but then @Artem Ervits told me to change it. Since his instructions are not for the command line ("hdfs > config > custom core site"), I did it through the Ambari configuration, restarted all affected services, and restarted Hue from the command line. Then I tried again, and this is what shows up:
Unauthorized connection for super-user
Created 04-11-2016 03:49 PM
Please check the values set for the following two properties from Ambari.
hive.server2.enable.impersonation
dfs.webhdfs.enabled
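For reference, these two properties live in hive-site.xml and hdfs-site.xml respectively (in Ambari they are edited under the Hive and HDFS services). A sketch of how they should read when both are enabled:

```xml
<!-- hive-site.xml: run Hive queries as the submitting user
     instead of the hive service user -->
<property>
  <name>hive.server2.enable.impersonation</name>
  <value>true</value>
</property>

<!-- hdfs-site.xml: enable the WebHDFS REST API,
     which Hue's File Browser uses to talk to HDFS -->
<property>
  <name>dfs.webhdfs.enabled</name>
  <value>true</value>
</property>
```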
Created 04-11-2016 05:02 PM
The value is true in both cases.
Created 04-11-2016 05:34 PM
Check the OS level user/group mapping as well.
# id hue
# grep hue /etc/passwd
# grep hue /etc/group