Support Questions

Find answers, ask questions, and share your expertise

Is there any way to pass parameters to a custom script in Ambari?


I'm working on a POC to take a backup of the NameNode metadata. For this, I'm compressing the NameNode's metadata using a custom script that I call through the Ambari API (https://community.hortonworks.com/articles/139788/running-custom-scripts-through-ambari.html). If I simply hardcode the data directory name, it works. However, I'd like to pass it as an argument when submitting the request. Is anyone aware of a way to do this?

1 ACCEPTED SOLUTION


Hi @Narendra Neerukonda,


As the article says:

Note: You can add comma separated inputs if you need any inputs for the script.

you can pass parameters to the script like this:

curl -u <username>:<password> -X POST -H 'X-Requested-By:ambari' -d '{"RequestInfo":{"context":"Execute my action", "action":"my_action", "parameters":{"my_input":"value"}}, "Requests/resource_filters":[{"service_name":"", "component_name":"", "hosts":"<comma_separated_host_names>"}]}' http://<ambari_host>:<port>/api/v1/clusters/<cluster_name>/requests

Where "my_input" is paramter name and "value" is the value to be passed.

Hope this helps. Please accept the answer and vote it up if it did.
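When the request body gets long, it can help to build it programmatically rather than inlining it in the curl command, which avoids quoting and bracket mistakes. A minimal sketch (the host names and the "my_input" parameter are placeholders from the answer above):

```python
import json

# Build the Ambari request body described above. The "parameters" map under
# RequestInfo is what your custom script will receive.
body = {
    "RequestInfo": {
        "context": "Execute my action",
        "action": "my_action",
        "parameters": {"my_input": "value"},
    },
    "Requests/resource_filters": [
        {"service_name": "", "component_name": "", "hosts": "host1,host2"}
    ],
}

# Serialize to the JSON string you pass with curl's -d flag.
print(json.dumps(body))
```

You could then save this as a script (name is hypothetical) and call `curl ... -d "$(python build_body.py)"` so the JSON is guaranteed to be well-formed.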



4 REPLIES




My previous comment is still under moderation, but the solution is: apparently it's mandatory to pass values for service_name and component_name.


Thanks @Akhil S Naik,

I'm able to pass the parameters now.

However, I'm facing a problem when submitting the request with "Requests/resource_filters".

If I do the following, the custom script is executed, but it runs on all hosts instead of only server1 and server2:

-d '{"RequestInfo":{"context":"Execute an action", "action" : "stream_file", "parameters" : {"file" : "/hadoop_mount/hdfs.tar.gz"},"service_name" : "", "component_name":"", "hosts":"server1,server2"}}' http://ambari-server:8080/api/v1/clusters/<clustername>/requests

To resolve this, I added the following, as per your sample above:

-d '{"RequestInfo":{"context":"Execute an action", "action" : "stream_file", "parameters" : {"file" : "/hadoop01/hdfs.tar.gz"}},"Requests/resource_filters":[{"service_name" : "", "component_name":"", "hosts":"server1,server2"}]}' http://ambari-server:8080/api/v1/clusters/<clustername>/requests

But when I submit this, even though the command goes only to the required hosts, it just hangs (grey gears; execution doesn't start at all). Any idea what might be blocking the execution?

With Regards,

Narendra


For users who come here in the future: the parameter you pass is available from the configs (config = Script.get_config()) inside your code, at config['roleParams']['my_input'] to be precise.
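A minimal sketch of what that looks like inside the custom script. On a real agent, config would come from Script.get_config() in Ambari's resource_management library (only available on agent hosts); here we mimic just the relevant slice of that dict:

```python
# The request's "parameters" map from RequestInfo surfaces under
# 'roleParams' in the config dict returned by Script.get_config().
# This config dict is a stand-in for illustration only.
config = {"roleParams": {"my_input": "value"}}

# Read the parameter the caller passed in the Ambari request.
my_input = config["roleParams"]["my_input"]
print(my_input)
```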


The above question and the entire response thread below was originally posted in the Community Help track. On Sat Jul 6 17:30 UTC 2019, a member of the HCC moderation staff moved it to the Cloud & Operations track. The Community Help Track is intended for questions about using the HCC site itself, not technical questions.

Bill Brooks, Community Moderator