
After enabling Kerberos, most services cannot be started on HDP Sandbox 2.6.4

New Contributor

I'm installing an HDP 2.6.4 sandbox in Azure for testing. I installed the sandbox and everything was OK.

Then I enabled Kerberos. Everything seemed to work until the last step, starting the services, which failed: it shows that it cannot start the Timeline Server.

The Kerberos server itself is fine; it has been providing Kerberos authentication for existing MongoDB and CDH deployments for a long time.

Below are the detailed error logs. Does anyone have an idea or suggestion for this type of error? I would appreciate your input and comments.

==========================Timeline Server=====================

resource_management.libraries.providers.hdfs_resource.WebHDFSCallException: Execution of 'curl -sS -L -w '%{http_code}' -X PUT --negotiate -u : 'http://sandbox-hdp.hortonworks.com:50070/webhdfs/v1/ats/done?op=SETPERMISSION&permission=755'' returned status_code=403.

{
  "RemoteException": {
    "exception": "AccessControlException",
    "javaClassName": "org.apache.hadoop.security.AccessControlException",
    "message": "Permission denied. user=dr.who is not the owner of inode=done"
  }
}
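The 403 above shows the WebHDFS request arriving as the unauthenticated static user `dr.who` rather than a real principal. A quick way to confirm this from the sandbox host (a sketch; the keytab path and principal are HDP defaults and should be verified on your cluster):

```shell
# Check whether this host currently holds a valid Kerberos ticket:
klist

# If not, obtain one from the HDFS headless keytab (default HDP location; verify locally):
kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs

# Retry the failing WebHDFS call with SPNEGO; with a valid ticket the request
# should no longer be mapped to the static user dr.who:
curl -sS -L -w '%{http_code}' -X PUT --negotiate -u : \
  'http://sandbox-hdp.hortonworks.com:50070/webhdfs/v1/ats/done?op=SETPERMISSION&permission=755'
```

If the manual call succeeds with a ticket but Ambari's own call still fails, the problem is which user identity the service is using, not Kerberos itself.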

I also found that many other services had not been started, so I tried to start them manually, but they all failed.

==========================MapReduce2 - History Server=====================

2018-04-09 05:43:31,271 INFO service.AbstractService (AbstractService.java:noteFailure(272)) - Service org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager failed in state INITED; cause: org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating done directory: [hdfs://sandbox-hdp.hortonworks.com:8020/mr-history/done]
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating done directory: [hdfs://sandbox-hdp.hortonworks.com:8020/mr-history/done]
    at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.tryCreatingHistoryDirs(HistoryFileManager.java:639)
    at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.createHistoryDirs(HistoryFileManager.java:585)
    at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.serviceInit(HistoryFileManager.java:550)
    at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
    at org.apache.hadoop.mapreduce.v2.hs.JobHistory.serviceInit(JobHistory.java:94)
    at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
    at org.apache.hadoop.service.CompositeService.serviceInit(CompositeService.java:107)
    at org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer.serviceInit(JobHistoryServer.java:143)
    at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
    at org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer.launchJobHistoryServer(JobHistoryServer.java:221)
    at org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer.main(JobHistoryServer.java:231)

Caused by: java.io.IOException: Failed on local exception: java.io.IOException: Server asks us to fall back to SIMPLE auth, but this client is configured to only allow secure connections.; Host Details : local host is: "sandbox-hdp.hortonworks.com/172.17.0.2"; destination host is: "sandbox-hdp.hortonworks.com":8020;
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:785)
    at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1558)
    at org.apache.hadoop.ipc.Client.call(Client.java:1498)
    at org.apache.hadoop.ipc.Client.call(Client.java:1398)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
    at com.sun.proxy.$Proxy9.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:823)

===============================Yarn - Resource Manager======================

2018-04-09 06:25:39,146 INFO security.UserGroupInformation (UserGroupInformation.java:loginUserFromKeytab(1101)) - Login successful for user rm/sandbox-hdp.hortonworks.com@TICQUEST.CLOUDAPP.NET using keytab file /etc/security/keytabs/rm.service.keytab

......

2018-04-09 06:25:55,933 INFO service.AbstractService (AbstractService.java:noteFailure(272)) - Service org.apache.hadoop.yarn.client.api.impl.TimelineClientImpl failed in state STARTED; cause: java.io.IOException: Failed on local exception: java.io.IOException: Server asks us to fall back to SIMPLE auth, but this client is configured to only allow secure connections.; Host Details : local host is: "sandbox-hdp.hortonworks.com/172.17.0.2"; destination host is: "sandbox-hdp.hortonworks.com":8020;
java.io.IOException: Failed on local exception: java.io.IOException: Server asks us to fall back to SIMPLE auth, but this client is configured to only allow secure connections.; Host Details : local host is: "sandbox-hdp.hortonworks.com/172.17.0.2"; destination host is: "sandbox-hdp.hortonworks.com":8020; at

......

2018-04-09 06:25:55,938 INFO service.AbstractService (AbstractService.java:noteFailure(272)) - Service org.apache.hadoop.yarn.server.resourcemanager.metrics.SystemMetricsPublisher failed in state STARTED; cause: org.apache.hadoop.service.ServiceStateException: java.io.IOException: Failed on local exception: java.io.IOException: Server asks us to fall back to SIMPLE auth, but this client is configured to only allow secure connections.; Host Details : local host is: "sandbox-hdp.hortonworks.com/172.17.0.2"; destination host is: "sandbox-hdp.hortonworks.com":8020;
org.apache.hadoop.service.ServiceStateException: java.io.IOException: Failed on local exception: java.io.IOException: Server asks us to fall back to SIMPLE auth, but this client is configured to only allow secure connections.; Host Details : local host is: "sandbox-hdp.hortonworks.com/172.17.0.2"; destination host is: "sandbox-hdp.hortonworks.com":8020;
    at org.apache.hadoop.service.ServiceStateException.convert(ServiceStateException.java:59)
    at org.apache.hadoop.service.AbstractService.start(AbstractService.java:204)
    at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:120) at
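The repeated "Server asks us to fall back to SIMPLE auth" error means the client side believes the cluster is secure while the NameNode it is talking to is still answering in simple (non-Kerberos) mode, or the daemons are reading stale configuration. A hedged way to check what each side thinks (run on a cluster node; paths are HDP defaults):

```shell
# Effective security mode as seen by the Hadoop client configuration.
# "kerberos" means secure; "simple" means this side was never switched over.
hdfs getconf -confKey hadoop.security.authentication

# Confirm which core-site.xml the daemons are actually reading
# (default HDP config directory; verify on your host):
grep -A1 'hadoop.security.authentication' /etc/hadoop/conf/core-site.xml
```

If these disagree, or the NameNode was not restarted after enabling Kerberos, restarting all stale services from Ambari usually brings both sides back to the same mode.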

1 ACCEPTED SOLUTION

Master Mentor

@Vincent Hu

If the cluster is managed by Ambari, please add the following property in:

Ambari > HDFS > Configurations > Advanced core-site > Add Property

hadoop.http.staticuser.user=yarn

Restart any stale services and retry.
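For reference, the property above ends up rendered in core-site.xml as the fragment below (a sketch of the resulting config; on an Ambari-managed cluster Ambari writes this file itself, so do not edit it by hand):

```xml
<property>
  <name>hadoop.http.staticuser.user</name>
  <value>yarn</value>
</property>
```

This sets the identity that unauthenticated HTTP requests (the `dr.who` user in the 403 above) are mapped to, so the Timeline Server's WebHDFS operations run as a user that actually owns the `/ats` directories.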


2 REPLIES


New Contributor

Yes, it works. Thanks @Geoffrey Shelton Okot.

But the property should be added to Custom core-site, not Advanced core-site:

Ambari > HDFS > Configs > Custom core-site > Add Property.

It seems this config causes many similar issues. Why doesn't HDP add it to the configuration automatically when enabling Kerberos?
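After adding the property under Custom core-site and restarting the stale services, you can confirm it took effect from any cluster node (a sketch; assumes the Hadoop client config on that node is Ambari-managed):

```shell
# Should report the value set in Ambari ("yarn") once the restarted
# daemons and the client config have picked up the new core-site.xml:
hdfs getconf -confKey hadoop.http.staticuser.user
```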