Member since: 12-06-2017
Posts: 12
Kudos Received: 0
Solutions: 0
09-25-2018
05:06 PM
Hi @Sandeep Nemuri, below is the script I am working on. It works fine when I run it on the Ambari host, but it started giving an error when run on the other nodes, because there localhost no longer points to the Ambari server. Can you please suggest what to use to get the Ambari host value when I run the same script on different nodes?

#!/bin/bash
AMBARI_HOST=localhost
echo "AMBARI_HOST: $AMBARI_HOST"
HOST_NAME=$(hostname --fqdn)
echo "HOST_NAME: $HOST_NAME"
AMBARI_PASSWORD=xxxxxxxxxxxx

# Detect the name of the cluster.
CLUSTER_OUTPUT=$(curl -u admin:$AMBARI_PASSWORD -i -H 'X-Requested-By: ambari' http://$AMBARI_HOST:8080/api/v1/clusters)
export CLUSTER=$(echo $CLUSTER_OUTPUT | sed -n 's/.*"cluster_name" : "\([^"]*\)".*/\1/p')
echo $CLUSTER

# Stop all host components on this host. The cluster name comes from the
# detected $CLUSTER instead of being hardcoded, and the single quotes are
# closed around $HOST_NAME so it actually expands inside the JSON payload.
curl -iv -u admin:$AMBARI_PASSWORD -H "X-Requested-By: ambari" -X PUT \
  -d '{"RequestInfo":{"context":"Stopping All Host Components","operation_level":{"level":"HOST","cluster_name":"'"$CLUSTER"'","host_names":"'"$HOST_NAME"'"},"query":"HostRoles/component_name/*"},"Body":{"HostRoles":{"state":"INSTALLED"}}}' \
  http://$AMBARI_HOST:8080/api/v1/clusters/$CLUSTER/hosts/$HOST_NAME/host_components
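One way to avoid hardcoding localhost, as a minimal sketch assuming every node runs ambari-agent with the default HDP config path (/etc/ambari-agent/conf/ambari-agent.ini), whose [server] section records the Ambari server FQDN in a hostname= entry:

#!/bin/bash
# Read the Ambari server hostname from the local agent config instead of
# hardcoding localhost; the path and the hostname= key assume a default
# ambari-agent installation on every node.
AMBARI_HOST=$(awk -F= '/^hostname/ {print $2; exit}' /etc/ambari-agent/conf/ambari-agent.ini)
echo "AMBARI_HOST: $AMBARI_HOST"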
09-24-2018
06:08 PM
Hi, thanks for the help, but how will this work for multiple clusters and nodes? Do I need to hardcode this script for each individual node, or is there another way so it fetches info like cluster_name and href automatically? @Sandeep Nemuri
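For reference, a minimal sketch of pulling the cluster and host names from the Ambari REST API rather than hardcoding them. It assumes the admin credentials from the earlier script and uses only the standard /api/v1/clusters endpoints; the server name shown is a placeholder:

#!/bin/bash
AMBARI_HOST=ambari.example.com   # placeholder; resolve per node as discussed above
AMBARI_PASSWORD=xxxxxxxxxxxx

# Discover every cluster registered with this Ambari server.
CLUSTERS=$(curl -s -u admin:$AMBARI_PASSWORD -H 'X-Requested-By: ambari' \
  http://$AMBARI_HOST:8080/api/v1/clusters \
  | sed -n 's/.*"cluster_name" : "\([^"]*\)".*/\1/p')

# For each cluster, list the hosts it contains.
for CLUSTER in $CLUSTERS; do
  echo "Cluster: $CLUSTER"
  curl -s -u admin:$AMBARI_PASSWORD -H 'X-Requested-By: ambari' \
    http://$AMBARI_HOST:8080/api/v1/clusters/$CLUSTER/hosts \
    | sed -n 's/.*"host_name" : "\([^"]*\)".*/\1/p'
done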
09-22-2018
03:34 PM
Hi, we host multiple Hadoop clusters and VMs on Azure, and per the requirement I need to create a script to start and stop the Hadoop clusters and the other VMs in the sequence below: 1. Stop all Hadoop services --> stop Cloudbreak --> stop all VMs. 2. Start all VMs --> start Cloudbreak --> start all Hadoop services (for Hadoop I don't know the sequence of services). Any suggestion on how to implement this? A sketch of the Hadoop step is shown below. @Jay Kumar SenSharma
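For the Hadoop part, a minimal sketch assuming the clusters are managed by Ambari (the hostname and cluster name below are placeholders). A single PUT against the services endpoint asks Ambari to stop or start everything, and Ambari itself works out the ordering of the individual services:

#!/bin/bash
AMBARI_HOST=ambari.example.com   # placeholder; replace with your Ambari server
CLUSTER=mycluster                # placeholder cluster name
AMBARI_PASSWORD=xxxxxxxxxxxx

# Stop every service in the cluster (state INSTALLED = stopped);
# Ambari sequences the services itself.
curl -u admin:$AMBARI_PASSWORD -H 'X-Requested-By: ambari' -X PUT \
  -d '{"RequestInfo":{"context":"Stop All Services"},"Body":{"ServiceInfo":{"state":"INSTALLED"}}}' \
  http://$AMBARI_HOST:8080/api/v1/clusters/$CLUSTER/services

# To start everything back up, send state STARTED instead:
# -d '{"RequestInfo":{"context":"Start All Services"},"Body":{"ServiceInfo":{"state":"STARTED"}}}'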
Labels:
- Hortonworks Cloudbreak
09-12-2018
11:27 PM
Hi @rkovacs, thanks. Please find the output:
HTTP/1.1 200 OK
Content-Type: binary/octet-stream
Content-Length: 12026642432
Connection: keep-alive
Date: Wed, 12 Sep 2018 14:32:10 GMT
Last-Modified: Wed, 11 Jul 2018 12:15:23 GMT
ETag: "d863b6fd5098be37a55b2597d76fef20-1434"
Accept-Ranges: bytes
Server: AmazonS3
X-Cache: Miss from cloudfront
Via: 1.1 4a55d86b7263f73c6817c7c25d4b3643.cloudfront.net (CloudFront)
X-Amz-Cf-Id: RIjSH7neV7s5aVhEkETsg4QCzkDubjg-V7WZzR521gkCO2dk4nyUoQ==
09-06-2018
06:19 PM
I installed OpenStack (Queens) on a single node (RHEL 7.4) and Cloudbreak on another node. I also created the credentials in Cloudbreak, but I am still not able to spin up the cluster through Cloudbreak; I get the error: image copy failed. @rkovacs
09-06-2018
06:17 PM
Hi, I have already installed the OpenStack Queens release on a single-node VM (RHEL 7.4) and Cloudbreak 2.7.1 on another VM. I created the credentials in Cloudbreak, but I am still not able to spin up the cluster through Cloudbreak; I get the error: image copy failed.
09-04-2018
07:10 PM
I am getting an issue with the Cloudbreak and OpenStack configuration. I followed the Hortonworks document but am still hitting the issue. I need help and guidance on how to launch a cluster through Cloudbreak using OpenStack. @khorvath @mmolnar @bbihari @Mathieu Perochon @Janos Matyas @rdoktorics
Labels:
- Hortonworks Cloudbreak
08-31-2018
06:05 PM
Thanks @Gulshad Ansari. Yes, we can store files in all of those storage types, but I want to confirm whether we can use them for the HDFS file system; as per my understanding, Hortonworks does not recommend that. Please correct me if I am wrong.
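For context, a hedged illustration of the distinction in question: on an HDP node, Azure Blob storage is addressed through its own wasb:// URI scheme as a Hadoop-compatible file system alongside HDFS, not as a drop-in replacement for it. The container and account names below are hypothetical, and the hadoop-azure driver plus the storage account key are assumed to be configured in core-site.xml already:

# List a (hypothetical) Azure Blob container through the Hadoop client.
hadoop fs -ls wasb://mycontainer@myaccount.blob.core.windows.net/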
08-29-2018
05:27 PM
I am looking for an option to create my HDFS file system on Azure. I checked it out; WASB is not an option, but I am not sure about ADLS Gen2 or any other option in Azure. I need suggestions on the same.
Labels:
- Apache Hadoop
08-29-2018
05:05 AM
I installed OpenStack (Queens release) and Cloudbreak on two different VMs. As per the Hortonworks document I created the credentials as well, but while launching the cluster I get this error: image copy failed. I followed this URL as-is to install OpenStack: https://www.tecmint.com/openstack-installation-guide-rhel-centos/ After reviewing the above URL, please let me know which steps I am missing, or what the right way is to build a private cloud for launching an HDP cluster.
Labels:
- Hortonworks Cloudbreak
03-16-2018
08:28 PM
I am following this doc but am still not able to catalog the AWS image: https://docs.hortonworks.com/HDPDocuments/Cloudbreak/Cloudbreak-2.4.0/content/images/index.html#custom-images
Labels:
- Hortonworks Cloudbreak