Member since: 01-22-2016
Posts: 26
Kudos Received: 10
Solutions: 2
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 3272 | 12-09-2016 08:51 AM |
| | 3017 | 09-14-2016 07:23 AM |
12-09-2016
08:51 AM
@Karan Alang Yes, I got it working. The first thing is that it will not work directly from Ambari, at least in version 2.1.1, because Ambari does not provide any support for securing it with Kerberos. I fixed the issue by taking the latest version of OpenTSDB from GitHub and installing it manually; interestingly, it then started working. In older versions of OpenTSDB the Kerberos support is not fully enabled. You can download OpenTSDB from here: https://github.com/OpenTSDB/opentsdb/releases/tag/v2.3.0RC2
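For reference, here is a minimal sketch of that manual install, assuming the v2.3.0RC2 tarball and OpenTSDB's standard build.sh flow. The Kerberos keys are the asynchbase property names, and the paths and principal below are placeholders, so verify them against your environment:

```
# Download and build OpenTSDB 2.3.0RC2 from the GitHub release
wget https://github.com/OpenTSDB/opentsdb/archive/v2.3.0RC2.tar.gz
tar xzf v2.3.0RC2.tar.gz && cd opentsdb-2.3.0RC2
./build.sh

# Kerberos-related settings in opentsdb.conf (asynchbase property names;
# keytab/principal values are placeholders for your realm)
cat >> /etc/opentsdb/opentsdb.conf <<'EOF'
hbase.security.auth.enable=true
hbase.security.authentication=kerberos
hbase.kerberos.regionserver.principal=hbase/_HOST@EXAMPLE.COM
hbase.sasl.clientconfig=Client
EOF
```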
11-04-2016
10:05 AM
@hu bai If you installed the Ranger plugin from Ambari, you need to make sure you configure the following properties correctly in the section "Advanced ranger-storm-plugin-properties":
- REPOSITORY_CONFIG_PASSWORD: <password for the Ranger repository user; you can create it yourself, which is good practice>
- Ranger repository config user: <Ranger repository user name>
- policy User for STORM: <storm>

Then you do not need to configure anything from the Ranger GUI itself. More importantly, run the Storm topology and check whether Ranger is really doing its work or not; you can verify this from the audit logs.
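As a sketch, the same properties can also be set from the Ambari host with the bundled configs.sh script (Ambari 2.x) instead of the UI. The host, cluster name, credentials, and values below are placeholders, and the property keys (as they appear in HDP 2.3) should be verified against your stack version:

```
cd /var/lib/ambari-server/resources/scripts
# ranger-storm-plugin-properties keys; verify the exact key names in your version
./configs.sh -u admin -p admin set ambari-host MYCLUSTER \
  ranger-storm-plugin-properties REPOSITORY_CONFIG_USERNAME 'rangerrepouser'
./configs.sh -u admin -p admin set ambari-host MYCLUSTER \
  ranger-storm-plugin-properties REPOSITORY_CONFIG_PASSWORD 'rangerrepopass'
./configs.sh -u admin -p admin set ambari-host MYCLUSTER \
  ranger-storm-plugin-properties policy_user 'storm'
# Restart Storm from Ambari afterwards so the plugin picks up the changes
```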
09-14-2016
07:23 AM
Hello All, Thanks for your help. I managed to resolve the issue by doing the following (see the sketch after this list):
1) I noticed this error message in the ambari-agent logs: {'msg':'Unable to read structured output from /var/lib/ambari-agent/data/structured-out-status.json'}
2) To resolve it, first stop the ambari-agent.
3) Move /var/lib/ambari-agent/data/structured-out-status.json to /tmp.
4) Restart the ambari-agent.
5) Everything is green again.
I followed the post mentioned here: https://community.hortonworks.com/questions/16953/unable-to-start-hdp-serrvices-from-ambari.html I don't know the exact reason behind this, but I was able to fix the issue. Thanks again for your help.
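The same steps as shell commands, using the agent's default file layout:

```
# Stop the agent, move the stale structured-output file aside, restart
ambari-agent stop
mv /var/lib/ambari-agent/data/structured-out-status.json /tmp/
ambari-agent start
```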
09-09-2016
10:18 AM
2 Kudos
Hi Community, We have seen unusual behavior on our Hortonworks cluster: on one of the hosts, Ambari shows that all the services are down, but in reality the services are running properly. This happened even after restarting the services from Ambari. The following has been checked (see the sketch after this list):
1) Verified the status of services from the logs, e.g. /var/hadoop/hdfs/hadoop-namenode .... (shows running)
2) Checked the PID of services from /var/run/<servicename>/<servicename.pid> (the new PID was present).
3) Stopped and started the Ambari agents and the Ambari server as well, but it didn't help.
Is there any way to fix this issue? If you need more information, please let me know.
HDP Version: 2.3.4
Ambari Version: 2.1.1
Thanks in advance. Cheers! Hammad
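For anyone reproducing the check in step 2, a minimal sketch (the PID-file path is a placeholder for whichever service you are checking):

```
# Cross-check the PID file against the live process
PIDFILE=/var/run/hadoop/hdfs/hadoop-hdfs-namenode.pid
ps -fp "$(cat "$PIDFILE")" && echo "Process is alive even though Ambari shows it down"
```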
Labels:
- Apache Ambari
06-30-2016
12:55 PM
1 Kudo
Hi All, Does anybody have documentation for installing OpenTSDB on the Hortonworks platform with Kerberos enabled? There is an example of installing OpenTSDB on the Hortonworks Sandbox here: https://community.hortonworks.com/articles/4577/use-opentsdb-to-storevisualize-stock-data-on-hdp-s.html It does not seem to work with Kerberos. I am able to install OpenTSDB manually and made the required Kerberos settings in the configuration file, but I am still having problems while starting the service. Somehow it is not able to read the HBase meta region and throws the following errors:
12:41:48.349 DEBUG [ClientCnxnSocketNIO.findSendablePacket] - deferring non-priming packet: clientPath:null serverPath:null finished:false header:: 0,-11 replyHeader:: 0,0,0 request:: null response:: null until SASL authentication completes.
12:41:48.356 DEBUG [ClientCnxn.readResponse] - Reading reply sessionid:0x1559dca73430bca, packet:: clientPath:/hbase-secure/root-region-server serverPath:/hbase-secure/root-region-server finished:false header:: 3,4 replyHeader:: 3,240518300453,-101 request:: '/hbase-secure/root-region-server,T response::
12:41:48.356 DEBUG [ClientCnxn.readResponse] - Reading reply sessionid:0x1559dca73430bca, packet:: clientPath:/hbase-secure/meta-region-server serverPath:/hbase-secure/meta-region-server finished:false header:: 4,4 replyHeader:: 4,240518300453,0 request:: '/hbase-secure/meta-region-server,T response:: #ffffffff0001a726567696f6e7365727665723a3136303230ffffff87ffffffab61b48ffffff8827ffffff9a50425546a40a3464657370702d7368726b2d77322e6e6f64652e65752d776573742e6465762d657370702e696e7472616e65742e656f6e2e636f6d10ffffff947d18ffffffa8ffffffa9ffffffd9ffffffc8ffffffd72a100183,s{227633711798,227633714655,1466615355376,1466615687830,1,0,0,0,105,0,227633711798}
- Unexpected exception from downstream on [id: 0x0af73a5a, /10.66.48.100:55821 => /10.66.48.102:16020]
12:43:51.593 ERROR [RegionClient.decodeLast] - After decoding the last message on [id: 0x0af73a5a, /10.66.48.100:55821 :> /10.66.48.102:16020], there was still some undecoded bytes in the channel's buffer (which are going to be lost):
If somebody else has faced the same error, please guide me on how it should be resolved. Thanks in advance.
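The "deferring non-priming packet ... until SASL authentication completes" lines suggest the ZooKeeper client is stuck waiting on Kerberos, which usually requires a JAAS file. A minimal sketch with placeholder keytab and principal, passed to the TSD JVM via -Djava.security.auth.login.config:

```
# Hypothetical JAAS "Client" section for the ZooKeeper/HBase SASL handshake
cat > /etc/opentsdb/jaas.conf <<'EOF'
Client {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  storeKey=true
  keyTab="/etc/security/keytabs/opentsdb.keytab"
  principal="opentsdb@EXAMPLE.COM";
};
EOF
```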
Labels:
- Apache HBase
02-09-2016
07:46 AM
1 Kudo
@Ali Bajwa Thanks for this. Deleting the service from the Ambari database worked for me; after that I restarted Ambari and was able to install a fresh version of Knox again.
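A minimal sketch of what that database cleanup can look like on a PostgreSQL-backed Ambari, using the Ambari 2.1 table names; stop the server and back up the database first, and verify the table names against your schema before running anything:

```
ambari-server stop
pg_dump -U ambari ambari > /tmp/ambari-backup.sql

# Remove leftover KNOX rows (table names from Ambari 2.1; verify before running)
psql -U ambari ambari <<'SQL'
DELETE FROM hostcomponentstate           WHERE service_name = 'KNOX';
DELETE FROM hostcomponentdesiredstate    WHERE service_name = 'KNOX';
DELETE FROM servicecomponentdesiredstate WHERE service_name = 'KNOX';
DELETE FROM servicedesiredstate          WHERE service_name = 'KNOX';
DELETE FROM clusterservices              WHERE service_name = 'KNOX';
SQL

ambari-server start
```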
02-07-2016
10:41 AM
@Neeraj Sabharwal Thanks for your answer. I deleted the Knox service with the above-mentioned command, which I had tried many times before too. When I checked in Ambari, the service was gone; then I restarted the Ambari server and found this exception in the logs:
Local Exception Stack: Exception [EclipseLink-4002] (Eclipse Persistence Services - 2.5.2.v20140319-9ad6abd): org.eclipse.persistence.exceptions.DatabaseException Internal Exception: org.postgresql.util.PSQLException: ERROR: duplicate key value violates unique constraint "clusterservices_pkey" Detail: Key (service_name, cluster_id)=(KNOX, 2) already exists.
Moreover, I can see the Knox service with a yellow sign once again in Ambari. My assumption is that the service is not being deleted from the Ambari database properly, which is why I am also unable to install it again. Any feedback or advice on what is actually happening and how to proceed (we might need to delete it from the database)? Thanks in advance. Hammad Ali
02-06-2016
09:06 AM
1 Kudo
Hi Guys, We set up a new HDP cluster with Ambari version 2.1, HDP version 2.3.4, and Knox version 0.6. After a successful first-time installation of Knox, we tried to configure it with LDAP and to restart the Knox gateway service, which didn't work; it seemed to be a bug that will be resolved in the next version of Ambari. We then decided to delete the Knox service via the REST API and install it once again with Ambari, which is not working now: at the end of the setup from Ambari we only get an error message, "server error". Any ideas or suggestions on what the next step should be? Thanks in advance. Regards, Hammad Ali
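For context, the REST deletion referred to above typically looks like this against the Ambari API (host, credentials, and cluster name are placeholders; the service has to be stopped before Ambari will accept the DELETE):

```
# Stop KNOX, then delete it from the cluster
curl -u admin:admin -H 'X-Requested-By: ambari' -X PUT \
  -d '{"RequestInfo":{"context":"Stop KNOX"},"Body":{"ServiceInfo":{"state":"INSTALLED"}}}' \
  http://ambari-host:8080/api/v1/clusters/MYCLUSTER/services/KNOX

curl -u admin:admin -H 'X-Requested-By: ambari' -X DELETE \
  http://ambari-host:8080/api/v1/clusters/MYCLUSTER/services/KNOX
```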
Labels:
- Apache Knox
- Cloudera DataFlow (CDF)
02-06-2016
08:58 AM
2 Kudos
I checked this on my sandbox cluster and it worked. Now we are setting up a new cluster with HDP 2.3.4, so I guess it should work there too. Thanks for the help.