Submitting a jar to a Cloudbreak-created cluster

Contributor

I have a MapReduce2 jar that I need to run on a cluster we're currently spinning up via Cloudbreak.

Previously, the machine we ran the jar from had all the Hadoop client libraries and config files installed, so we could simply run "hadoop jar jarfile.jar" (via a scheduling framework) and be good to go. That won't work now, because some values (particularly hostnames/IPs) may change each time the cluster is provisioned.

What's the recommended method for submitting work to the cluster from an external client in this situation?

Are there API endpoints we can use to pull all the necessary configs once the cluster is spun up? Or is there an API we can submit the jar to directly?

Thanks

1 ACCEPTED SOLUTION

Super Collaborator

Hi,

What do you mean by 'some values (particularly hostnames/IPs) may change'?

If you want to get the service configs, I suggest using the Ambari API:

http://<ip-address>:8080/api/v1/clusters/testcluster/configurations lists the service config endpoints.

If you follow the URLs returned in that response, you will get the configs.
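For example, something along these lines should work (a rough sketch; 'admin:admin', the cluster name 'testcluster' and the 'mapred-site' config type are placeholders for your own values):

curl -u admin:admin "http://<ip-address>:8080/api/v1/clusters/testcluster/configurations"

lists the available config types and their version tags, and

curl -u admin:admin "http://<ip-address>:8080/api/v1/clusters/testcluster/configurations?type=mapred-site&tag=<tag-from-previous-call>"

returns the actual properties for that config type.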

Br,

R


2 REPLIES


Super Collaborator

You can download the configs from Ambari, and if you have the client libraries installed locally, you can use those configs with the public IPs. Alternatively, you could SSH to one of the instances and submit the jobs from there.
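For example, a minimal sketch of the SSH approach (the SSH user, host, jar path, main class and arguments below are placeholders, not values from your cluster):

scp jarfile.jar <ssh-user>@<master-public-ip>:/tmp/
ssh <ssh-user>@<master-public-ip> "hadoop jar /tmp/jarfile.jar <MainClass> <args>"

Since that node is part of the cluster, it already has the client libraries and configs in place, so "hadoop jar" resolves the correct hostnames without needing any configuration on your local machine.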