How to add the SPARK_JOBHISTORYSERVER component to an already-installed SPARK service
Labels: Apache Ambari, Apache Spark
Created ‎02-10-2017 11:01 AM
My original goal was to move the component to another node.
So I followed what Panwar said in https://community.hortonworks.com/questions/4272/process-for-moving-hdp-services-manually.html. However, I did not call the APIs in the order Panwar suggested:
curl -u admin:admin -H "X-Requested-By: ambari" -X PUT -d '{"RequestInfo":{"context":"Stop Component"},"Body":{"HostRoles":{"state":"INSTALLED"}}}' http://:8080/api/v1/clusters/clustername/services/SPARK
curl -u admin:admin -H "X-Requested-By: ambari" -X DELETE http://:8080/api/v1/clusters/clustername/services/S...
I deleted the component while it was still running:
curl -u admin:admin -H "X-Requested-By: ambari" -X DELETE http://namenode01.will.com:8080/api/v1/clusters/datacenter/services/SPARK/components/SPARK_JOBHISTOR...
Then the component disappeared from the stack. I cannot find it at http://namenode01.will.com:8080/api/v1/clusters/datacenter/services/SPARK/components now:
{
  "href" : "http://namenode01.will.com:8080/api/v1/clusters/datacenter/services/SPARK/components",
  "items" : [
    {
      "href" : "http://namenode01.will.com:8080/api/v1/clusters/datacenter/services/SPARK/components/SPARK_CLIENT",
      "ServiceComponentInfo" : {
        "cluster_name" : "datacenter",
        "component_name" : "SPARK_CLIENT",
        "service_name" : "SPARK"
      }
    },
    {
      "href" : "http://namenode01.will.com:8080/api/v1/clusters/datacenter/services/SPARK/components/SPARK_THRIFTSERVER",
      "ServiceComponentInfo" : {
        "cluster_name" : "datacenter",
        "component_name" : "SPARK_THRIFTSERVER",
        "service_name" : "SPARK"
      }
    }
  ]
}
It also disappeared from the UI.
Can anybody help me, please?
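For what it's worth, Ambari expects a host component to be stopped (state INSTALLED) before it is deleted. A minimal sketch of that stop-then-delete order, building the two requests without sending them; the host and cluster names (namenode01.will.com, datacenter) are taken from this question and an authentication handler for admin credentials is still needed to actually send them:

```python
import json
import urllib.request

# Names below come from the question above; adjust for your cluster.
BASE = "http://namenode01.will.com:8080/api/v1/clusters/datacenter"

# First: stop the host component by setting its state to INSTALLED.
stop = urllib.request.Request(
    f"{BASE}/hosts/namenode01.will.com/host_components/SPARK_JOBHISTORYSERVER",
    data=json.dumps({
        "RequestInfo": {"context": "Stop Component"},
        "Body": {"HostRoles": {"state": "INSTALLED"}},
    }).encode(),
    headers={"X-Requested-By": "ambari"},
    method="PUT",
)

# Only then: delete the component.
delete = urllib.request.Request(
    f"{BASE}/services/SPARK/components/SPARK_JOBHISTORYSERVER",
    headers={"X-Requested-By": "ambari"},
    method="DELETE",
)

print(stop.get_method(), stop.full_url)
print(delete.get_method(), delete.full_url)
```

Sending each request with `urllib.request.urlopen` (plus basic-auth credentials) would mirror the curl calls in the linked thread.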
Created ‎02-10-2017 01:05 PM
Here are steps to move SHS from one node to another. Replace admin:admpw with your user-name:password. Run on the Ambari server, or replace localhost with your Ambari server FQDN. Replace "mycluster" with your cluster name. Replace host2.example.com with your target node FQDN.
- Delete SHS component (it looks like you already did this so you can skip this step)
$ curl -i -H "X-Requested-By:ambari" -u admin:admpw -X DELETE http://localhost:8080/api/v1/clusters/mycluster/services/SPARK/components/SPARK_JOBHISTORYSERVER
- Add SHS again
$ curl -i -H "X-Requested-By:ambari" -u admin:admpw -X POST http://localhost:8080/api/v1/clusters/mycluster/services/SPARK/components/SPARK_JOBHISTORYSERVER
- Install host component on the new node: host2.example.com
$ curl -i -u admin:admpw -H "X-Requested-By:ambari" -X POST -d '{"host_components" : [{"HostRoles":{"component_name":"SPARK_JOBHISTORYSERVER"}}] }' "http://localhost:8080/api/v1/clusters/mycluster/hosts?Hosts/host_name=host2.example.com"
- In Ambari, go to Hosts --> host2.example.com, click on Spark History Server, and then click the "Reinstall" button
- In Ambari --> Spark, start the Spark service and run the Spark service check
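The curl steps above can also be sketched as plain request objects (built here, not sent), using the same placeholders from the answer (localhost, mycluster, host2.example.com); real use would add basic-auth credentials and `urlopen`:

```python
import json
import urllib.request

# Placeholders from the answer above; replace with your own values.
BASE = "http://localhost:8080/api/v1/clusters/mycluster"
HEADERS = {"X-Requested-By": "ambari"}

# Step 2: re-create the SPARK_JOBHISTORYSERVER service component.
add_component = urllib.request.Request(
    f"{BASE}/services/SPARK/components/SPARK_JOBHISTORYSERVER",
    headers=HEADERS,
    method="POST",
)

# Step 3: attach the host component to the target node.
install_on_host = urllib.request.Request(
    f"{BASE}/hosts?Hosts/host_name=host2.example.com",
    data=json.dumps({"host_components": [
        {"HostRoles": {"component_name": "SPARK_JOBHISTORYSERVER"}},
    ]}).encode(),
    headers=HEADERS,
    method="POST",
)

print(add_component.get_method(), add_component.full_url)
print(install_on_host.get_method(), install_on_host.full_url)
```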
Created ‎02-10-2017 11:09 AM
Another question: I cannot find SPARK_THRIFTSERVER in /var/lib/ambari-server/resources/common-services/SPARK/1.2.0.2.2/metainfo.xml. Why does it exist in the SPARK service?
Created ‎02-10-2017 11:38 AM
You can add the Spark JobHistoryServer back using the Ambari web UI:
- Go to the host details page: click on the Hosts tab, then click on the host where you want to install the Spark JobHistoryServer.
- Click on the Add button. It should show the list of components that can be added to the host.
- Click on Spark JobHistoryServer. Please see the attached image add-component.png.
If for any reason you are not able to add it from the UI, or you want to use the Ambari Server REST APIs instead, the article at link has a section "Step 5 - Create host components" which can be used to add host components.
Another question: I cannot find SPARK_THRIFTSERVER in /var/lib/ambari-server/resources/common-services/SPARK/1.2.0.2.2/metainfo.xml. Why does it exist in the SPARK service?
Thrift server support for Spark was first added in the Spark 1.4.1 service definition in Ambari (link). It is defined in /var/lib/ambari-server/resources/common-services/SPARK/1.4.1/metainfo.xml.
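To see which components a given service definition actually declares, you can list the component names in its metainfo.xml. A minimal sketch: the inline sample below only mimics the shape of Ambari's metainfo.xml (the component list is an assumption for illustration); on a real server you would point `ET.parse()` at the file under /var/lib/ambari-server/resources/common-services/SPARK/<version>/metainfo.xml:

```python
import xml.etree.ElementTree as ET

# Sample shaped like an Ambari metainfo.xml; the component list here is
# illustrative, not a claim about any specific SPARK version.
SAMPLE = """\
<metainfo>
  <services>
    <service>
      <name>SPARK</name>
      <components>
        <component><name>SPARK_JOBHISTORYSERVER</name></component>
        <component><name>SPARK_CLIENT</name></component>
        <component><name>SPARK_THRIFTSERVER</name></component>
      </components>
    </service>
  </services>
</metainfo>
"""

root = ET.fromstring(SAMPLE)
# Collect every <component><name> in document order.
components = [c.findtext("name") for c in root.iter("component")]
print(components)
```

Running the same traversal against the 1.2.0.2.2 and 1.4.1 metainfo.xml files would show directly which definition introduces SPARK_THRIFTSERVER.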
