Member since: 12-28-2015
Posts: 74
Kudos Received: 17
Solutions: 7
My Accepted Solutions
Views | Posted
---|---
1442 | 05-17-2017 03:15 PM
5796 | 03-21-2017 11:35 AM
13292 | 03-04-2017 09:51 AM
2111 | 02-09-2017 04:03 PM
3518 | 01-19-2017 11:24 AM
02-02-2017 11:59 AM
I finally configured Ambari for Kerberos manually, even though the documentation says that Ambari performs that setup automatically, and it finally worked. I had to do this "optional" step: http://docs.hortonworks.com/HDPDocuments/Ambari-2.4.1.0/bk_ambari-security/content/set_up_kerberos_for_ambari_server.html It looks like it's not optional at all. Thank you very much.
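For reference, that "optional" step boils down to giving the Ambari server its own principal and keytab and registering them via ambari-server setup-security. A minimal sketch, assuming an MIT KDC for the HADOOP.INT realm; the principal name and keytab path are illustrative, not taken from the thread:

# On the KDC: create a principal for the Ambari server and export its keytab
kadmin.local -q "addprinc -randkey ambari-server@HADOOP.INT"
kadmin.local -q "xst -k /etc/security/keytabs/ambari.server.keytab ambari-server@HADOOP.INT"

# On the Ambari host: lock the keytab down to the ambari user
chown ambari:ambari /etc/security/keytabs/ambari.server.keytab
chmod 400 /etc/security/keytabs/ambari.server.keytab

# Register the principal/keytab with Ambari (choose the Kerberos JAAS option) and restart
ambari-server setup-security
ambari-server restart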
02-01-2017 12:54 PM
@Artem Ervits I have also read that part. It's supposed to be configured automatically when I use Ambari for the Kerberos setup. And yes, I've restarted ambari-server. What I have found now is that I can curl from the servers using @HADOOP.INT realm (krb5) users, but I cannot with @TEST.INT AD realm users. However, if I go to a Windows machine with a user logged into TEST.INT, I can access these websites using any web browser, even after removing the hadoop.auth cookie. Output of the curl error:

[root@hadoop02 lib]# curl --negotiate -u admin:admin http://hadoop01.int:8088/ws/v1/cluster/info
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1"/>
<title>Error 403 GSSException: Failure unspecified at GSS-API level (Mechanism level: Checksum failed)</title>
</head>
<body><h2>HTTP ERROR 403</h2>
<p>Problem accessing /ws/v1/cluster/info. Reason:
<pre> GSSException: Failure unspecified at GSS-API level (Mechanism level: Checksum failed)</pre></p><hr /><i><small>Powered by Jetty://</small></i><br/>
</body>
</html>
Ambari is supposed to use a @HADOOP.INT realm user. Edited: the Ambari Metrics service check went fine.
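When a 403 "Checksum failed" appears only for users from one realm, the usual suspects are a stale SPNEGO keytab (key version mismatch) or encryption types the trusted realm does not share. A few checks that often narrow this down; the keytab path and principals below are illustrative HDP defaults, not confirmed from the thread:

# Inspect key versions (KVNO) and enctypes stored in the SPNEGO keytab
klist -kte /etc/security/keytabs/spnego.service.keytab

# Get a ticket from the keytab, then ask the KDC for the current key version to compare
kinit -kt /etc/security/keytabs/spnego.service.keytab HTTP/hadoop01.int@HADOOP.INT
kvno HTTP/hadoop01.int@HADOOP.INT

# Retry with a ticket from each realm to isolate the trust direction
kdestroy && kinit someuser@TEST.INT
curl --negotiate -u : http://hadoop01.int:8088/ws/v1/cluster/info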
02-01-2017 11:46 AM
Hello! I have a Kerberos-enabled cluster with:
- Ambari 2.4.1
- HDP stack 2.4.0
- A one-way trust between Kerberos and AD

After enabling HTTP authentication as described at http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.4.0/bk_Security_Guide/content/_configuring_http_authentication_for_HDFS_YARN_MapReduce2_HBase_Oozie_Falcon_and_Storm.html I have lost information in Ambari, as the following images show. If I check ambari-server.log, I find a lot of entries with this message:

01 Feb 2017 12:41:47,167 WARN [ambari-metrics-retrieval-service-thread-1] RequestTargetAuthentication:88 - NEGOTIATE authentication error: No valid credentials provided (Mechanism level: No valid credentials provided (Mechanism level: Attempt to obtain new INITIATE credentials failed! (null)))
01 Feb 2017 12:41:47,169 ERROR [ambari-metrics-retrieval-service-thread-1] AppCookieManager:122 - SPNego authentication failed, can not get hadoop.auth cookie for URL: http://hadoop01.int:50070/jmx?get=Hadoop:service=NameNode,name=FSNamesystem::tag.HAState
01 Feb 2017 12:41:47,168 WARN [ambari-metrics-retrieval-service-thread-0] RequestTargetAuthentication:88 - NEGOTIATE authentication error: No valid credentials provided (Mechanism level: No valid credentials provided (Mechanism level: Attempt to obtain new INITIATE credentials failed! (null)))
01 Feb 2017 12:41:47,170 ERROR [ambari-metrics-retrieval-service-thread-0] AppCookieManager:122 - SPNego authentication failed, can not get hadoop.auth cookie for URL: http://hadoop01.int:8088/jmx
I have tried regenerating the keytab, but the problem persists. I can access these URLs with my browser using either a Kerberos ticket or an AD Windows ticket without any problem. Any clue about what may be happening? Thank you in advance.
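The "Attempt to obtain new INITIATE credentials failed" wording suggests it is ambari-server itself that cannot acquire a ticket, rather than the NameNode or ResourceManager rejecting one. A quick way to confirm, assuming the JAAS file and keytab locations that Ambari 2.4 uses by default (both paths and the principal are illustrative):

# Ambari reads its SPNEGO client credentials from this JAAS config
cat /etc/ambari-server/conf/krb5JAASLogin.conf

# Confirm the keytab it references exists and can actually produce a ticket as the ambari user
klist -kt /etc/security/keytabs/ambari.server.keytab
sudo -u ambari kinit -kt /etc/security/keytabs/ambari.server.keytab ambari-server@HADOOP.INT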
Labels:
- Apache Ambari
- Cloudera Manager
01-19-2017 11:24 AM
1 Kudo
Finally, I solved this. Due to a bad sudoers configuration, Ambari was unable to execute some scripts with sudo. The following configuration was missing from sudoers:

# Ambari: Core System Commands
ambari ALL=(ALL) NOPASSWD:SETENV: /usr/bin/yum,/usr/bin/zypper,/usr/bin/apt-get, /bin/mkdir, /usr/bin/test, /bin/ln, /bin/ls, /bin/chown, /bin/chmod, /bin/chgrp, /bin/cp, /usr/sbin/setenforce, /usr/bin/test, /usr/bin/stat, /bin/mv, /bin/sed, /bin/rm, /bin/kill, /bin/readlink, /usr/bin/pgrep, /bin/cat, /usr/bin/unzip, /bin/tar, /usr/bin/tee, /bin/touch, /usr/bin/mysql, /sbin/service mysqld *, /usr/bin/dpkg *, /bin/rpm *, /usr/sbin/hst *
# Ambari: Hadoop and Configuration Commands
ambari ALL=(ALL) NOPASSWD:SETENV: /usr/bin/hdp-select, /usr/bin/conf-select, /usr/hdp/current/hadoop-client/sbin/hadoop-daemon.sh, /usr/lib/hadoop/bin/hadoop-daemon.sh, /usr/lib/hadoop/sbin/hadoop-daemon.sh, /usr/bin/ambari-python-wrap *
# Ambari: System User and Group Commands
ambari ALL=(ALL) NOPASSWD:SETENV: /usr/sbin/groupadd, /usr/sbin/groupmod, /usr/sbin/useradd, /usr/sbin/usermod
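After editing, it's worth validating the file syntax and confirming what the ambari user may now run; a quick check, assuming the rules live in /etc/sudoers.d/ambari (the file location is illustrative):

# Syntax-check the sudoers fragment before relying on it
visudo -cf /etc/sudoers.d/ambari

# List the commands the ambari user is allowed to run without a password
sudo -l -U ambari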
01-19-2017 10:17 AM
It seems to be a sudoers configuration problem: if I change "NOPASSWD:SETENV: /bin/mkdir, /bin/cp, /bin/chmod, /bin/rm" to "NOPASSWD: ALL", it seems to work.
01-19-2017 10:13 AM
@Jay SenSharma The error persists even after applying that ownership.
01-19-2017 10:12 AM
Hi @Sagar Shimpi, with the root user it works correctly. Here is my sudoers configuration related to Ambari:

Defaults exempt_group = ambari
Defaults: ambari !requiretty
ambari ALL=(ALL) NOPASSWD:SETENV: /bin/mkdir, /bin/cp, /bin/chmod, /bin/rm
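One way to pinpoint which rule is failing is to re-run the commands Ambari attempts, non-interactively, as the ambari user; sudo -n fails immediately instead of prompting when no rule matches. The scratch path below is hypothetical:

su - ambari -s /bin/bash -c 'sudo -n /bin/mkdir /tmp/ambari-sudo-check && echo mkdir OK'
su - ambari -s /bin/bash -c 'sudo -n /bin/rm -r /tmp/ambari-sudo-check && echo rm OK'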
01-19-2017 08:40 AM
1 Kudo
Hello,
I'm trying to download our cluster config files, but I'm getting a 500 status as the response.
Ambari response: {
"status" : 500,
"message" : "org.apache.ambari.server.controller.spi.SystemException: Execution of \"ambari-python-wrap /var/lib/ambari-server/resources/common-services/SPARK/1.2.1/package/scripts/spark_client.py generate_configs /var/lib/ambari-server/data/tmp/SPARK_CLIENT-configuration.json /var/lib/ambari-server/resources/common-services/SPARK/1.2.1/package /var/lib/ambari-server/data/tmp/structured-out.json INFO /var/lib/ambari-server/data/tmp\" returned 1. java.lang.Throwable: 2017-01-19 09:17:44,499 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf\n2017-01-19 09:17:44,502 - call['ambari-python-wrap /usr/bin/hdp-select status spark-client'] {'timeout': 20}\n2017-01-19 09:17:44,540 - call returned (0, 'spark-client - 2.4.0.0-169')\n2017-01-19 09:17:44,542 - Directory['/var/lib/ambari-server/data/tmp'] {'create_parents': True}\n2017-01-19 09:17:44,564 - Creating directory Directory['/var/lib/ambari-server/data/tmp'] since it doesn't exist.\nTraceback (most recent call last):\n File \"/var/lib/ambari-server/resources/common-services/SPARK/1.2.1/package/scripts/spark_client.py\", line 88, in <module>\n SparkClient().execute()\n File \"/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py\", line 280, in execute\n method(env)\n File \"/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py\", line 806, in generate_configs\n Directory(self.get_tmp_dir(), create_parents = True)\n File \"/usr/lib/python2.6/site-packages/resource_management/core/base.py\", line 155, in __init__\n self.env.run()\n File \"/usr/lib/python2.6/site-packages/resource_management/core/environment.py\", line 160, in run\n self.run_action(resource, action)\n File \"/usr/lib/python2.6/site-packages/resource_management/core/environment.py\", line 124, in run_action\n provider_action()\n File \"/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py\", line 194, in action_create\n raise Fail(\"Applying %s failed, file %s already exists\" % (self.resource, path))\nresource_management.core.exceptions.Fail: Applying Directory['/var/lib/ambari-server/data/tmp'] failed, file /var/lib/ambari-server/data/tmp already exists\n"
}

ambari-server.log output:

19 Jan 2017 09:17:44,361 INFO [ambari-client-thread-1474693] Configuration:1026 - Reading password from existing file
19 Jan 2017 09:17:44,709 ERROR [ambari-client-thread-1474693] ClientConfigResourceProvider:418 - Execution of "ambari-python-wrap /var/lib/ambari-server/resources/common-services/SPARK/1.2.1/package/scripts/spark_client.py generate_configs /var/lib/ambari-server/data/tmp/SPARK_CLIENT-configuration.json /var/lib/ambari-server/resources/common-services/SPARK/1.2.1/package /var/lib/ambari-server/data/tmp/structured-out.json INFO /var/lib/ambari-server/data/tmp" returned 1.
java.util.concurrent.ExecutionException: Execution of "ambari-python-wrap /var/lib/ambari-server/resources/common-services/SPARK/1.2.1/package/scripts/spark_client.py generate_configs /var/lib/ambari-server/data/tmp/SPARK_CLIENT-configuration.json /var/lib/ambari-server/resources/common-services/SPARK/1.2.1/package /var/lib/ambari-server/data/tmp/structured-out.json INFO /var/lib/ambari-server/data/tmp" returned 1.
at org.apache.ambari.server.controller.internal.ClientConfigResourceProvider.executeCommand(ClientConfigResourceProvider.java:491)
at org.apache.ambari.server.controller.internal.ClientConfigResourceProvider.getResources(ClientConfigResourceProvider.java:410)
at org.apache.ambari.server.controller.internal.ClusterControllerImpl$ExtendedResourceProviderWrapper.queryForResources(ClusterControllerImpl.java:966)
at org.apache.ambari.server.controller.internal.ClusterControllerImpl.getResources(ClusterControllerImpl.java:141)
at org.apache.ambari.server.api.query.QueryImpl.doQuery(QueryImpl.java:529)
at org.apache.ambari.server.api.query.QueryImpl.queryForResources(QueryImpl.java:398)
at org.apache.ambari.server.api.query.QueryImpl.execute(QueryImpl.java:222)
at org.apache.ambari.server.api.handlers.ReadHandler.handleRequest(ReadHandler.java:77)
at org.apache.ambari.server.api.services.BaseRequest.process(BaseRequest.java:145)
at org.apache.ambari.server.api.services.BaseService.handleRequest(BaseService.java:126)
at org.apache.ambari.server.api.services.BaseService.handleRequest(BaseService.java:90)
at org.apache.ambari.server.api.services.ComponentService.createClientConfigResource(ComponentService.java:226)
at org.apache.ambari.server.api.services.ComponentService.getComponent(ComponentService.java:79)
at sun.reflect.GeneratedMethodAccessor260.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:302)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.SubLocatorRule.accept(SubLocatorRule.java:137)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.SubLocatorRule.accept(SubLocatorRule.java:137)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1542)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1473)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1419)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1409)
at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:684)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1507)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330)
at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118)
at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.apache.ambari.server.security.authorization.AmbariAuthorizationFilter.doFilter(AmbariAuthorizationFilter.java:257)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:103)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:45)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.apache.ambari.server.security.authorization.jwt.JwtAuthenticationFilter.doFilter(JwtAuthenticationFilter.java:96)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilter(BasicAuthenticationFilter.java:150)
at org.apache.ambari.server.security.authentication.AmbariAuthenticationFilter.doFilter(AmbariAuthenticationFilter.java:88)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.apache.ambari.server.security.authorization.AmbariUserAuthorizationFilter.doFilter(AmbariUserAuthorizationFilter.java:91)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:87)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342)
at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:192)
at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:160)
at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:237)
at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:167)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1478)
at org.apache.ambari.server.api.MethodOverrideFilter.doFilter(MethodOverrideFilter.java:72)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1478)
at org.apache.ambari.server.api.AmbariPersistFilter.doFilter(AmbariPersistFilter.java:47)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1478)
at org.apache.ambari.server.security.AbstractSecurityHeaderFilter.doFilter(AbstractSecurityHeaderFilter.java:109)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1478)
at org.eclipse.jetty.servlets.UserAgentFilter.doFilter(UserAgentFilter.java:82)
at org.eclipse.jetty.servlets.GzipFilter.doFilter(GzipFilter.java:294)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1478)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:499)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:137)
at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:557)
at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:231)
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1086)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:427)
at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:193)
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1020)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:135)
at org.apache.ambari.server.controller.AmbariHandlerList.processHandlers(AmbariHandlerList.java:212)
at org.apache.ambari.server.controller.AmbariHandlerList.processHandlers(AmbariHandlerList.java:201)
at org.apache.ambari.server.controller.AmbariHandlerList.handle(AmbariHandlerList.java:139)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:116)
at org.eclipse.jetty.server.Server.handle(Server.java:370)
at org.eclipse.jetty.server.AbstractHttpConnection.handleRequest(AbstractHttpConnection.java:494)
at org.eclipse.jetty.server.AbstractHttpConnection.headerComplete(AbstractHttpConnection.java:973)
at org.eclipse.jetty.server.AbstractHttpConnection$RequestHandler.headerComplete(AbstractHttpConnection.java:1035)
at org.eclipse.jetty.http.HttpParser.parseNext(HttpParser.java:641)
at org.eclipse.jetty.http.HttpParser.parseAvailable(HttpParser.java:231)
at org.eclipse.jetty.server.AsyncHttpConnection.handle(AsyncHttpConnection.java:82)
at org.eclipse.jetty.io.nio.SelectChannelEndPoint.handle(SelectChannelEndPoint.java:696)
at org.eclipse.jetty.io.nio.SelectChannelEndPoint$1.run(SelectChannelEndPoint.java:53)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608)
at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.Throwable: 2017-01-19 09:17:44,499 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2017-01-19 09:17:44,502 - call['ambari-python-wrap /usr/bin/hdp-select status spark-client'] {'timeout': 20}
2017-01-19 09:17:44,540 - call returned (0, 'spark-client - 2.4.0.0-169')
2017-01-19 09:17:44,542 - Directory['/var/lib/ambari-server/data/tmp'] {'create_parents': True}
2017-01-19 09:17:44,564 - Creating directory Directory['/var/lib/ambari-server/data/tmp'] since it doesn't exist.
Traceback (most recent call last):
File "/var/lib/ambari-server/resources/common-services/SPARK/1.2.1/package/scripts/spark_client.py", line 88, in <module>
SparkClient().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 280, in execute
method(env)
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 806, in generate_configs
Directory(self.get_tmp_dir(), create_parents = True)
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 194, in action_create
raise Fail("Applying %s failed, file %s already exists" % (self.resource, path))
resource_management.core.exceptions.Fail: Applying Directory['/var/lib/ambari-server/data/tmp'] failed, file /var/lib/ambari-server/data/tmp already exists
... 102 more
ambari-audit.log output:

2017-01-19T09:17:44.710+0100, User(admin), RemoteIp(10.0.1.10), Operation(Client config download), RequestType(GET), url(http://pbigd01.int:8080/api/v1/clusters/pro/services/SPARK/components/SPARK_CLIENT?format=client_config_tar), ResultStatus(500 Internal Server Error), Reason(org.apache.ambari.server.controller.spi.SystemException: Execution of "ambari-python-wrap /var/lib/ambari-server/resources/common-services/SPARK/1.2.1/package/scripts/spark_client.py generate_configs /var/lib/ambari-server/data/tmp/SPARK_CLIENT-configuration.json /var/lib/ambari-server/resources/common-services/SPARK/1.2.1/package /var/lib/ambari-server/data/tmp/structured-out.json INFO /var/lib/ambari-server/data/tmp" returned 1. java.lang.Throwable: 2017-01-19 09:17:44,499 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2017-01-19 09:17:44,502 - call['ambari-python-wrap /usr/bin/hdp-select status spark-client'] {'timeout': 20}
2017-01-19 09:17:44,540 - call returned (0, 'spark-client - 2.4.0.0-169')
2017-01-19 09:17:44,542 - Directory['/var/lib/ambari-server/data/tmp'] {'create_parents': True}
2017-01-19 09:17:44,564 - Creating directory Directory['/var/lib/ambari-server/data/tmp'] since it doesn't exist.
Traceback (most recent call last):
File "/var/lib/ambari-server/resources/common-services/SPARK/1.2.1/package/scripts/spark_client.py", line 88, in <module>
SparkClient().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 280, in execute
method(env)
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 806, in generate_configs
Directory(self.get_tmp_dir(), create_parents = True)
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 194, in action_create
raise Fail("Applying %s failed, file %s already exists" % (self.resource, path))
resource_management.core.exceptions.Fail: Applying Directory['/var/lib/ambari-server/data/tmp'] failed, file /var/lib/ambari-server/data/tmp already exists
), Service(SPARK), Component(SPARK_CLIENT)
I have also tried removing /var/lib/ambari-server/data/tmp, but Ambari just keeps showing the same error after recreating the directory with the following file inside:

total 176K
drwxr-xr-x 2 ambari ambari 4.0K Jan 19 09:29 .
drwxr-xr-x. 5 ambari root 4.0K Jan 19 09:29 ..
-rw-r--r-- 1 ambari ambari 168K Jan 19 09:29 SPARK_CLIENT-configuration.json

Ambari-server runs as the ambari user. Any idea about what might be going on?
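For anyone reproducing this, the request that triggers the 500 is the client-config download endpoint recorded in the audit log above; the equivalent curl call (host, cluster, and credentials as they appear in the post, output filename arbitrary) would be:

curl -u admin:admin -o SPARK_CLIENT-configs.tar.gz \
  "http://pbigd01.int:8080/api/v1/clusters/pro/services/SPARK/components/SPARK_CLIENT?format=client_config_tar"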
Labels:
- Apache Ambari
11-25-2016 11:42 AM
Well, finally I solved this.
The "Connecting to namenode via http://<hostname>:50070/fsck?ugi=hdfs&blocks=1&files=1&locations=1&path=%2Fuser%2Fbigdata%2F.<directory>" line is the stderr output of the command, so redirecting stderr to /dev/null does the job :).

for i in $(hadoop fs -ls /user/bigdata/ | grep drwx | awk '{print $8}'); do echo "$i $(hdfs fsck $i -blocks -files -locations 2> /dev/null | grep BP- | wc -l)" ; done
11-25-2016 10:33 AM
Hello, I usually use for loops to get info about some folders. For example, I had to find which folder inside /user/bigdata was consuming a high number of blocks due to small files, so I used this:

for i in $(hadoop fs -ls /user/bigdata/ | grep drwx | awk '{print $8}'); do echo "$i $(hdfs fsck $i -blocks -files -locations | grep BP- | wc -l)" ; done

This prints a lot of "Connecting to namenode via http://<hostname>:50070/fsck?ugi=hdfs&blocks=1&files=1&locations=1&path=%2Fuser%2Fbigdata%2F.<directory>" messages. Is there any way to hide this message? I currently have to redirect the output to a file and then use cat to read the clean information. Thank you in advance.
Labels:
- Apache Hadoop