Member since: 12-22-2017
Posts: 34
Kudos Received: 2
Solutions: 2
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 3135 | 11-06-2019 02:25 PM
 | 8504 | 02-02-2018 06:10 PM
11-06-2019
02:25 PM
1 Kudo
There is no issue with the Ambari agent.
11-06-2019
02:11 PM
Sorry for the late (very late) update on this. I resolved the issue myself.
09-03-2019
01:43 PM
Are you still having the issue? If so, can you share your Spark interpreter config?
06-22-2018
05:08 PM
Hi, I have the same issue. But when I tried to work on it, the conf directory had not been created, so I am stuck here.
04-27-2018
08:25 PM
Is it removed from the HDP 2.6.4 GA version too? I don't see the Zeppelin view.
04-27-2018
08:06 PM
It seems this issue was addressed in HDP 2.6.4, as I didn't see the scenario described above.
04-27-2018
06:40 PM
@Sonu Sahi You have to provide the HiveServer2 server name and the port (default is 10000).
04-24-2018
10:30 PM
@Shyam Sunder Rai I set the Group member attribute* parameter (memberUid) to member.
04-24-2018
05:53 PM
Hi Rishi, yes, I am able to telnet, and with the vastool command I can see the users and groups. I can also run ambari-server sync-ldap --groups and ambari-server sync-ldap --users, but when I run ambari-server sync-ldap --all I get the error. Thanks, Sankar.
04-24-2018
04:58 PM
I am having the same issue with broken links. Any idea why the links break? It's happening every week.
04-19-2018
09:37 PM
Hi, I am getting the error below when I try to run an LDAP sync. Please help me fix the issue.
$ sudo ambari-server sync-ldap --all
Using python /usr/bin/python
Syncing with LDAP...
Enter Ambari Admin login: admin
Enter Ambari Admin password:
Syncing all... ERROR: Exiting with exit code 1.
REASON: Caught exception running LDAP sync. XXXXXXXX:636; socket closed; nested exception is javax.naming.ServiceUnavailableException: XXXXXXXX:636; socket closed
Ambari-server log:
ERROR [pool-18-thread-2] LdapSyncEventResourceProvider:460 - Caught exception running LDAP sync.
org.springframework.ldap.ServiceUnavailableException: XXXXXXXX:636; socket closed; nested exception is javax.naming.ServiceUnavailableException: XXXXXXXX:636; socket closed
at org.springframework.ldap.support.LdapUtils.convertLdapException(LdapUtils.java:223)
at org.springframework.ldap.core.support.AbstractContextSource.createContext(AbstractContextSource.java:356)
at org.springframework.ldap.core.support.AbstractContextSource.doGetContext(AbstractContextSource.java:140)
at org.springframework.ldap.core.support.AbstractContextSource.getReadOnlyContext(AbstractContextSource.java:159)
at org.springframework.ldap.core.LdapTemplate.search(LdapTemplate.java:357)
at org.springframework.ldap.core.LdapTemplate.search(LdapTemplate.java:309)
at org.springframework.ldap.core.LdapTemplate.search(LdapTemplate.java:642)
at org.springframework.ldap.core.LdapTemplate.search(LdapTemplate.java:578)
at org.apache.ambari.server.security.ldap.AmbariLdapDataPopulator.getFilteredLdapUsers(AmbariLdapDataPopulator.java:667)
at org.apache.ambari.server.security.ldap.AmbariLdapDataPopulator.getLdapUserByMemberAttr(AmbariLdapDataPopulator.java:480)
at org.apache.ambari.server.security.ldap.AmbariLdapDataPopulator.refreshGroupMembers(AmbariLdapDataPopulator.java:381)
at org.apache.ambari.server.security.ldap.AmbariLdapDataPopulator.synchronizeAllLdapGroups(AmbariLdapDataPopulator.java:194)
at org.apache.ambari.server.controller.AmbariManagementControllerImpl.synchronizeLdapUsersAndGroups(AmbariManagementControllerImpl.java:5191)
at org.apache.ambari.server.controller.internal.LdapSyncEventResourceProvider.syncLdap(LdapSyncEventResourceProvider.java:490)
at org.apache.ambari.server.controller.internal.LdapSyncEventResourceProvider.processSyncEvents(LdapSyncEventResourceProvider.java:448)
at org.apache.ambari.server.controller.internal.LdapSyncEventResourceProvider.access$000(LdapSyncEventResourceProvider.java:65)
at org.apache.ambari.server.controller.internal.LdapSyncEventResourceProvider$1.run(LdapSyncEventResourceProvider.java:259)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: javax.naming.ServiceUnavailableException: XXXXXXXX:636; socket closed
at com.sun.jndi.ldap.Connection.readReply(Connection.java:454)
at com.sun.jndi.ldap.LdapClient.ldapBind(LdapClient.java:365)
at com.sun.jndi.ldap.LdapClient.authenticate(LdapClient.java:214)
at com.sun.jndi.ldap.LdapCtx.connect(LdapCtx.java:2791)
at com.sun.jndi.ldap.LdapCtx.<init>(LdapCtx.java:319)
Labels:
- Apache Ambari
04-19-2018
08:05 PM
Hi Experts, I have the same issue, and I don't have these parameters set in the ambari.properties file: recovery.enabled_components=METRICS_COLLECTOR and recovery.type=AUTO_START. But after some time the Metrics Collector starts by itself.
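For reference, a minimal sketch of how those two entries would look if they were present; the file path shown is the usual Ambari Server location and is an assumption here:

# /etc/ambari-server/conf/ambari.properties (path assumed)
recovery.enabled_components=METRICS_COLLECTOR
recovery.type=AUTO_START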
03-29-2018
04:37 PM
I finished the ambari-server setup-ldap successfully. I ran ambari-server sync-ldap --all and got the error below.
Using python /usr/bin/python
Syncing with LDAP...
Enter Ambari Admin login: admin
Enter Ambari Admin password:
Syncing all... ERROR: Exiting with exit code 1.
REASON: Caught exception running LDAP sync. simple bind failed: Secondary LDAPS URL:636; nested exception is javax.naming.CommunicationException: simple bind failed: LDAPS server URL:636 [Root exception is javax.net.ssl.SSLException: java.lang.RuntimeException: Unexpected error: java.security.InvalidAlgorithmParameterException: the trustAnchors parameter must be non-empty
Tags:
- Hadoop Core
- hdp2.6.4
Labels:
- Hortonworks Data Platform (HDP)
02-02-2018
09:15 PM
@slachterman
Do you want any more information from me to resolve the issue? I am stuck here and not able to move forward with the other policy.
02-02-2018
06:10 PM
@Carlton Patterson If you are using Ambari Views, you can use the Hive View: after executing your query, you can export the results to a CSV file on your local system or to HDFS (see the attached ambari-hive-view.png).
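Outside the Hive View, a minimal command-line sketch of the same export; the JDBC URL, query, and output path are assumptions:

# run a query through beeline and save the results as CSV (placeholders throughout)
beeline -u "jdbc:hive2://<hiveserver2-host>:10000" --outputformat=csv2 \
  -e "SELECT * FROM my_table" > /tmp/results.csv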
02-02-2018
05:42 PM
You need to restart the KDC server and also the KMS server to pick up the latest configuration changes.
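A minimal sketch of the KDC restart, assuming a standard MIT KDC managed by systemd; in an HDP cluster the Ranger KMS service is normally restarted from the Ambari UI:

# restart the MIT KDC and admin daemons (service names assumed)
sudo systemctl restart krb5kdc kadmin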
01-30-2018
09:25 PM
Can you please provide me the solution?
01-29-2018
10:19 PM
@Sami Ahmad Check these parameters in the HDFS configuration and make the changes as suggested.
01-26-2018
05:29 PM
@slachterman I did that, but for security purposes I didn't put the server name in the post. I worked with the API in an earlier project and we were able to do something similar for the Kerberos client, and it worked fine there. But here I am getting the error.
01-25-2018
11:14 PM
Were you able to complete the rest of the steps?
01-25-2018
10:38 PM
@slachterman I followed the steps stated above, and I was not successful in creating the dynamic context. Below are the steps.
1. I created the JSON file with the "contextEnrichers" and "policyConditions" sections (validated with a JSON validator tool).
2. I ran the curl command "curl -v -H 'Content-Type: application/json' -u admin:admin -X PUT --data hiveService2.json http://<FQDN>:6080/service/public/v2/api/servicedef/name/hive" and got the output below.
* Server auth using Basic with user 'admin'
> PUT /service/public/v2/api/servicedef/name/hive HTTP/1.1
> Authorization: Basic YWRtaW46YWRtaW4=
> User-Agent: curl/7.29.0
> Host: FQDN:6080
> Accept: */*
> Content-Type: application/json
> Content-Length: 17
>
* upload completely sent off: 17 out of 17 bytes
< HTTP/1.1 404 Not Found
< Server: Apache-Coyote/1.1
< Set-Cookie: RANGERADMINSESSIONID=26D6411248523A3341C7680AABC8A68A; Path=/; HttpOnly
< X-Frame-Options: DENY
< Content-Length: 0
< Date: Thu, 25 Jan 2018 22:24:57 GMT
<
* Connection #0 to host <FQDN> left intact
Let me know where I went wrong. I am using HDP 2.6.3.
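For comparison, a minimal sketch of how a servicedef update is commonly sent with curl when the payload lives in a file; note the @ prefix on --data, which makes curl send the file's contents rather than the literal file name (the file name, credentials, and host below are assumptions):

curl -v -u admin:admin -X PUT \
  -H 'Content-Type: application/json' \
  --data @hiveService2.json \
  http://<FQDN>:6080/service/public/v2/api/servicedef/name/hive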
01-24-2018
10:02 PM
Can you redeploy HA and check whether any steps were missed during the HA enablement process? Please follow the steps suggested by Hortonworks: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.4/bk_hadoop-high-availability/content/ch_HA-NameNode.html
01-24-2018
09:34 PM
Try restarting the Ambari Metrics service and check whether the error goes away.
01-24-2018
08:35 PM
@n c Try this and see if it works: delete the configuration below from hbase-site (see the sketch after the list).
- hbase.bucketcache.ioengine
- hbase.bucketcache.percentage.in.combinedcache
- hbase.bucketcache.size
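If hbase-site.xml is edited by hand rather than through the Ambari UI, these are the kinds of entries to remove; the values shown are placeholders:

<!-- remove these properties from hbase-site.xml (values are placeholders) -->
<property>
  <name>hbase.bucketcache.ioengine</name>
  <value>offheap</value>
</property>
<property>
  <name>hbase.bucketcache.percentage.in.combinedcache</name>
  <value>0.6</value>
</property>
<property>
  <name>hbase.bucketcache.size</name>
  <value>4096</value>
</property>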
01-24-2018
08:30 PM
@Adil Muganlinsky Can you verify that /etc/yum.repos.d contains only the current versions of the repo files? If not, clean up the directory, keep only the current Ambari and HDP repos, and try again.
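A minimal sketch of that cleanup; the stale repo file names are assumptions:

# list the repo files, then move stale Ambari/HDP repo definitions out of the way
ls /etc/yum.repos.d/
sudo mkdir -p /root/repo-backup
sudo mv /etc/yum.repos.d/ambari-<old-version>.repo /root/repo-backup/
sudo mv /etc/yum.repos.d/HDP-<old-version>.repo /root/repo-backup/
sudo yum clean all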
01-24-2018
08:21 PM
@Michael Bronson can you share the log file? It may tell you the root cause.
01-24-2018
07:29 PM
@laki cheli After logging in as root, su to hdfs (su hdfs) and run the rest of the commands to create the directories.
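A minimal sketch of that sequence; the directory path and ownership are assumptions, since the commands from the original thread are not shown here:

# switch from root to the hdfs service user, then create and hand over the HDFS directory
su - hdfs
hdfs dfs -mkdir -p /user/<username>
hdfs dfs -chown <username>:hdfs /user/<username>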
01-24-2018
07:23 PM
@Kuber S Can you share the Advanced spark2-thrift-sparkconf configurations?
01-19-2018
07:14 PM
@Ajay I have a similar issue. What is the change that you made to get it working?