Created 12-29-2015 08:55 AM
Hi Team,
I am facing the following errors while browsing the Hive UI and the File Browser in Hue.
Hive UI Error -
QueryServerException at /beeswax/ Bad status for request TOpenSessionReq(username='hue', password='hue', client_protocol=4, configuration={'hive.server2.proxy.user': u'admin'}): TOpenSessionResp(status=TStatus(errorCode=0, errorMessage='Failed to open new session:
File Browser Error -
WebHdfsException at /filebrowser/ SecurityException: Failed to obtain user group information: org.apache.hadoop.security.authorize.AuthorizationException: User: hue is not allowed to impersonate admin (error 403)
Note: Installation and configuration of Hue was done by following the URL below.
Created 02-08-2016 06:24 AM
Divakar: Thanks for your consideration, but as I mentioned above, the problem has been resolved by running build/env/bin/hue syncdb --noinput
Created 12-29-2015 10:35 AM
The issue is related to impersonation. Make sure to follow the steps here.
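For reference, the HDFS proxy-user entries that Hue relies on usually look roughly like the following in core-site.xml (a sketch, assuming the Hue service runs as the 'hue' user; narrow the wildcards to your actual Hue host and allowed groups where security matters):

<property>
  <name>hadoop.proxyuser.hue.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hue.groups</name>
  <value>*</value>
</property>

A restart of HDFS (and the services that read core-site.xml) is normally needed for the change to take effect; the 403 "hue is not allowed to impersonate admin" error is the symptom these properties address.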
Created 12-29-2015 11:31 AM
Pradeep,
There are multiple hive-site.xml files; which one should we use for the Hue configuration?
1. /etc/hive/2.3.2.0-2950/0
2. /etc/hive/conf
3. /etc/hive/conf.install
Created 12-29-2015 02:55 PM
Try echo $HIVE_CONF_DIR and check the output. It should be /etc/hive/conf
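If you want to confirm which of those three directories is actually in use, something like the following can help (a sketch; on HDP, /etc/hive/conf is typically a symlink managed by conf-select, so the resolved path may look different on your cluster):

# Show the configuration directory Hive clients are pointed at
echo $HIVE_CONF_DIR

# Resolve the symlinks to see which versioned directory they point to
ls -ld /etc/hive/conf /etc/hive/conf.install
readlink -f /etc/hive/conf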
Created 12-29-2015 03:39 PM
The Hue proxy hosts/groups can easily be managed with Ambari; you will need to add properties within the HDFS, Oozie, and Hive service configurations and then ensure those are pushed out to your hosts.
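As a rough sketch of what usually needs to be added per service in Ambari, in addition to the core-site entries shown earlier (exact section names and property keys can vary slightly between HDP versions, so treat this as a starting point rather than an exact recipe):

# HDFS -> Custom core-site
hadoop.proxyuser.hue.hosts=*
hadoop.proxyuser.hue.groups=*

# Oozie -> Custom oozie-site
oozie.service.ProxyUserService.proxyuser.hue.hosts=*
oozie.service.ProxyUserService.proxyuser.hue.groups=*

# Hive -> Custom webhcat-site
webhcat.proxyuser.hue.hosts=*
webhcat.proxyuser.hue.groups=*

After saving, restart the affected services so Ambari pushes the updated configuration files to all hosts.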
Created 12-29-2015 11:48 PM
@Nilesh Do it through Ambari; don't worry about these multiple hive-site.xml files. In hue.ini, it should be /etc/hive/conf
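For example, the relevant part of hue.ini would look something like this (a sketch of the [beeswax] section Hue uses for its Hive app; the HiveServer2 hostname below is a placeholder you would replace with your own):

[beeswax]
  # Host where HiveServer2 is running
  hive_server_host=<your-hiveserver2-host>
  # Port on which the HiveServer2 Thrift service listens
  hive_server_port=10000
  # Hive configuration directory, where hive-site.xml is located
  hive_conf_dir=/etc/hive/conf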
Created 01-26-2016 04:56 PM
Hi,
I am also facing the same issue and tried the above solution with Ambari, but unfortunately it did not work, so any other help would be appreciated. The error I get is:
raise QueryServerException(Exception('Bad status for request %s:\n%s' % (req, res)), message=message)
QueryServerException: Bad status for request TOpenSessionReq(username='hue', password='hue', client_protocol=4, configuration={'hive.server2.proxy.user': u'hue'}):
TOpenSessionResp(status=TStatus(errorCode=0, errorMessage='Failed to open new session: java.lang.RuntimeException: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): Unauthorized connection for super-user: hive from IP 192.168.56.42', sqlState=None, infoMessages=['*org.apache.hive.service.cli.HiveSQLException:Failed to open new session: java.lang.RuntimeException: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): Unauthorized connection for super-user: hive from IP 192.168.56.42:13:12', 'org.apache.hive.service.cli.session.SessionManager:openSession:SessionManager.java:266', 'org.apache.hive.service.cli.CLIService:openSessionWithImpersonation:CLIService.java:202', 'org.apache.hive.service.cli.thrift.ThriftCLIService:getSessionHandle:ThriftCLIService.java:402', 'org.apache.hive.service.cli.thrift.ThriftCLIService:OpenSession:ThriftCLIService.java:297', 'org.apache.hive.service.cli.thrift.TCLIService$Processor$OpenSession:getResult:TCLIService.java:1253', 'org.apache.hive.service.cli.thrift.TCLIService$Processor$OpenSession:getResult:TCLIService.java:1238', 'org.apache.thrift.ProcessFunction:process:ProcessFunction.java:39', 'org.apache.thrift.TBaseProcessor:process:TBaseProcessor.java:39', 'org.apache.hive.service.auth.TSetIpAddressProcessor:process:TSetIpAddressProcessor.java:56', 'org.apache.thrift.server.TThreadPoolServer$WorkerProcess:run:TThreadPoolServer.java:285', 'java.util.concurrent.ThreadPoolExecutor:runWorker:ThreadPoolExecutor.java:1145', 'java.util.concurrent.ThreadPoolExecutor$Worker:ru
Created 02-02-2016 12:10 PM
Why not use Ambari Views? Hue is deprecated in HDP 2.3. @Saurabh Kumar
Created 02-08-2016 05:42 AM
Yes, you are right @artem Evits, we should use Views, but for that we would need a dedicated server to manage the Views; otherwise it may overload our Ambari server, and whenever we make changes and restart Ambari, users may also be impacted.
Created 01-27-2016 07:32 PM
Make this change and it will work:
hadoop.proxyuser.hive.groups = *
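In core-site.xml that change would look roughly like the snippet below. Since the error above complains about the connecting IP ("Unauthorized connection for super-user: hive from IP ..."), the matching hosts property usually needs to allow that host as well (a sketch using wildcards; restrict them where your security policy requires):

<property>
  <name>hadoop.proxyuser.hive.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hive.groups</name>
  <value>*</value>
</property>

Restart HDFS and HiveServer2 afterwards so the new proxy-user settings are picked up.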