Configure Workflow Manager and Oozie to work with HA HDFS, Hive, etc.

Contributor

After enabling HA on our HDP cluster, we get the following error from Oozie when configuring a workflow from Ambari Workflow Manager:

2016-10-18 17:17:29,956 WARN V1JobsServlet:523 - SERVER[hdp-name1.lab.croc.ru] USER[-] GROUP[-] TOKEN[-] APP[-] JOB[-] ACTION[-] URL[POST http://hdp-name1.lab.croc.ru:11000/oozie/v2/jobs] user error, java.net.UnknownHostException: null
java.lang.IllegalArgumentException: java.net.UnknownHostException: null
    at org.apache.hadoop.security.SecurityUtil.buildTokenService(SecurityUtil.java:411)
    at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:311)
    at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:176)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:688)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:629)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:159)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2761)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:99)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2795)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2777)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:386)
    at org.apache.oozie.service.HadoopAccessorService$4.run(HadoopAccessorService.java:577)
    at org.apache.oozie.service.HadoopAccessorService$4.run(HadoopAccessorService.java:575)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
    at org.apache.oozie.service.HadoopAccessorService.createFileSystem(HadoopAccessorService.java:575)
    at org.apache.oozie.service.AuthorizationService.authorizeForApp(AuthorizationService.java:374)
    at org.apache.oozie.servlet.BaseJobServlet.checkAuthorizationForApp(BaseJobServlet.java:260)
    at org.apache.oozie.servlet.BaseJobsServlet.doPost(BaseJobsServlet.java:99)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
    at org.apache.oozie.servlet.JsonRestServlet.service(JsonRestServlet.java:304)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.apache.oozie.servlet.AuthFilter$2.doFilter(AuthFilter.java:171)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:614)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:573)
    at org.apache.oozie.servlet.AuthFilter.doFilter(AuthFilter.java:176)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.apache.oozie.servlet.HostnameFilter.doFilter(HostnameFilter.java:86)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.apache.oozie.servlet.OozieXFrameOptionsFilter.doFilter(OozieXFrameOptionsFilter.java:48)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.apache.oozie.servlet.OozieCSRFFilter.doFilter(OozieCSRFFilter.java:62)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:293)
    at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:861)
    at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:620)
    at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:489)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.net.UnknownHostException: null
    ... 50 more

Which configuration changes are we missing?
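
For context, the createNonHAProxy frame together with "UnknownHostException: null" usually means the HDFS URI handed to Oozie resolves to hdfs://null, i.e. the HA nameservice is not visible in the Hadoop configuration that Oozie loads (the directory referenced by oozie.service.HadoopAccessorService.hadoop.configurations in oozie-site.xml), or the workflow still points at a single NameNode host instead of the nameservice URI. Not a confirmed fix, just a minimal sketch of the HA client properties involved; the nameservice name "mycluster" and the hostnames are placeholders:

    <!-- hdfs-site.xml, as seen by Oozie (placeholder nameservice and hosts) -->
    <property>
      <name>dfs.nameservices</name>
      <value>mycluster</value>
    </property>
    <property>
      <name>dfs.ha.namenodes.mycluster</name>
      <value>nn1,nn2</value>
    </property>
    <property>
      <name>dfs.namenode.rpc-address.mycluster.nn1</name>
      <value>namenode1.example.com:8020</value>
    </property>
    <property>
      <name>dfs.namenode.rpc-address.mycluster.nn2</name>
      <value>namenode2.example.com:8020</value>
    </property>
    <property>
      <name>dfs.client.failover.proxy.provider.mycluster</name>
      <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
    </property>

    <!-- core-site.xml, as seen by Oozie -->
    <property>
      <name>fs.defaultFS</name>
      <value>hdfs://mycluster</value>
    </property>

With those visible to Oozie, the workflow's NameNode value should be the nameservice URI (hdfs://mycluster) rather than hdfs://<single-namenode-host>:8020.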

8 REPLIES

Re: Configure Workflow Manager and Oozie to work with HA HDFS, Hive, etc.

New Contributor

Is there an update on this?

I'm facing the same issue.

HDP 2.5, HA HDFS, Oozie Workflow view.

Re: Configure Workflow Manager and Oozie to work with HA HDFS, Hive, etc.

Contributor

Unfortunately, we still don't have a solution. Our HDP version is the same.

Re: Configure Workflow Manager and Oozie to work with HA HDFS, Hive, etc.

New Contributor

Do you have a workaround? E.g. Hue or Falcon or something?

Re: Configure Workflow Manager and Oozie to work with HA HDFS, Hive, etc.

Expert Contributor

We also get this error, same HDP 2.5.

Re: Configure Workflow Manager and Oozie to work with HA HDFS, Hive, etc.

Contributor

We used the Oozie command-line utility instead.
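
For anyone going the same route, a minimal sketch of a command-line submission against an HA nameservice; the nameservice "mycluster", the ResourceManager address, and the application path are placeholders (the Oozie URL is taken from the log above):

    # job.properties (placeholder values)
    nameNode=hdfs://mycluster
    jobTracker=resourcemanager.example.com:8050
    oozie.wf.application.path=${nameNode}/user/${user.name}/my-workflow

    # submit and start the workflow
    oozie job -oozie http://hdp-name1.lab.croc.ru:11000/oozie -config job.properties -run

The point is that nameNode refers to the nameservice URI, so the client never hard-codes a single NameNode host.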

Re: Configure Workflow Manager and Oozie to work with HA HDFS, Hive, etc.

New Contributor

Hi all,

Has anyone found a workaround for this error?

Thanks

Re: Configure Workflow Manager and Oozie to work with HA HDFS, Hive, etc.

Rising Star

Re: Configure Workflow Manager and Oozie to work with HA HDFS, Hive, etc.

New Contributor

I've configured Ambari with Kerberos, and the principal is ambari-server@org.com.

The HDFS proxyuser is set to ambari-server for hosts and groups. The Files view and Hive view work correctly.

Perhaps the problem is that my ambari-server process is running as 'root'?
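
For reference, a sketch of the proxyuser settings described above as they would appear in core-site.xml; the short name "ambari-server" follows this post, and the "*" values are permissive placeholders that can be narrowed to specific hosts and groups:

    <property>
      <name>hadoop.proxyuser.ambari-server.hosts</name>
      <value>*</value>
    </property>
    <property>
      <name>hadoop.proxyuser.ambari-server.groups</name>
      <value>*</value>
    </property>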
