Support Questions


REST API to fetch config files like hdfs-site.xml, yarn-site.xml, etc.


I want to fetch all Hadoop config files with all of their configuration, e.g. the HDFS, Hive, HBase, YARN, etc. config files. Is there a REST API for this?

1 ACCEPTED SOLUTION

Jay SenSharma (Master Mentor)

@Anurag Mishra

You can try the following API call. I tested it on Ambari 2.5.1, which provides this option to download the client configs.

# mkdir /tmp/All_Configs
# cd /tmp/All_Configs/

# curl -iv -u admin:admin -H "X-Requested-By: ambari" -X GET  http://localhost:8080/api/v1/clusters/Sandbox/components?format=client_config_tar   -o /tmp/All_Configs/cluster_configs.tar.gz


Now you can extract the "/tmp/All_Configs/cluster_configs.tar.gz" file, and inside it you will find most of the configs (a scripted version of these steps is sketched after the listing below).

I found the following configs inside the mentioned gz file:

Downloads/Sandbox(CLUSTER)-configs

| | | |____Sandbox(CLUSTER)-configs
| | | | |____.DS_Store
| | | | |____ATLAS_CLIENT
| | | | | |____application.properties
| | | | | |____atlas-env.sh
| | | | | |____atlas-log4j.xml
| | | | | |____atlas-solrconfig.xml
| | | | |____FALCON_CLIENT
| | | | | |____falcon-env.sh
| | | | | |____runtime.properties
| | | | | |____startup.properties
| | | | |____HBASE_CLIENT
| | | | | |____hbase-env.sh
| | | | | |____hbase-policy.xml
| | | | | |____hbase-site.xml
| | | | | |____log4j.properties
| | | | |____HDFS_CLIENT
| | | | | |____core-site.xml
| | | | | |____hadoop-env.sh
| | | | | |____hdfs-site.xml
| | | | | |____log4j.properties
| | | | |____HIVE_CLIENT
| | | | | |____hive-env.sh
| | | | | |____hive-exec-log4j.properties
| | | | | |____hive-log4j.properties
| | | | | |____hive-site.xml
| | | | |____INFRA_SOLR_CLIENT
| | | | | |____log4j.properties
| | | | |____MAPREDUCE2_CLIENT
| | | | | |____core-site.xml
| | | | | |____mapred-env.sh
| | | | | |____mapred-site.xml
| | | | |____OOZIE_CLIENT
| | | | | |____oozie-env.sh
| | | | | |____oozie-log4j.properties
| | | | | |____oozie-site.xml
| | | | |____PIG
| | | | | |____log4j.properties
| | | | | |____pig-env.sh
| | | | | |____pig.properties
| | | | |____SLIDER
| | | | | |____core-site.xml
| | | | | |____hdfs-site.xml
| | | | | |____log4j.properties
| | | | | |____slider-client.xml
| | | | | |____slider-env.sh
| | | | | |____yarn-site.xml
| | | | |____SPARK2_CLIENT
| | | | | |____spark-defaults.conf
| | | | | |____spark-env.sh
| | | | | |____spark-log4j.properties
| | | | | |____spark-metrics.properties
| | | | |____SPARK_CLIENT
| | | | | |____spark-defaults.conf
| | | | | |____spark-env.sh
| | | | | |____spark-log4j.properties
| | | | | |____spark-metrics.properties
| | | | |____SQOOP
| | | | | |____sqoop-env.sh
| | | | | |____sqoop-site.xml
| | | | |____TEZ_CLIENT
| | | | | |____tez-env.sh
| | | | | |____tez-site.xml
| | | | |____YARN_CLIENT
| | | | | |____capacity-scheduler.xml
| | | | | |____core-site.xml
| | | | | |____log4j.properties
| | | | | |____yarn-env.sh
| | | | | |____yarn-site.xml
| | | | |____ZOOKEEPER_CLIENT
| | | | | |____log4j.properties
| | | | | |____zookeeper-env.sh
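
As a rough, scripted sketch of the same download-and-extract steps (it uses the same sandbox URL, cluster name, and admin credentials as the curl command above; adjust them for your cluster):

#!/usr/bin/env python3
# Sketch only: download the client-config tarball from Ambari and extract it.
# The URL, cluster name, credentials, and output directory are placeholders.
import io
import os
import tarfile

import requests

AMBARI = "http://localhost:8080"
CLUSTER = "Sandbox"
AUTH = ("admin", "admin")
OUT_DIR = "/tmp/All_Configs"

url = f"{AMBARI}/api/v1/clusters/{CLUSTER}/components?format=client_config_tar"
resp = requests.get(url, auth=AUTH, headers={"X-Requested-By": "ambari"})
resp.raise_for_status()

os.makedirs(OUT_DIR, exist_ok=True)
# The response body is a tar.gz archive; "r:*" lets tarfile auto-detect the compression.
with tarfile.open(fileobj=io.BytesIO(resp.content), mode="r:*") as tar:
    tar.extractall(OUT_DIR)
    print("\n".join(tar.getnames()))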



13 REPLIES

Super Guru

Have you looked at Ambari Blueprints?

https://cwiki.apache.org/confluence/display/AMBARI/Blueprints

curl -H "X-Requested-By: ambari" -X GET -u ambariuser:ambaripassword "http://YOUR_AMBARI_SERVER:8080/api/v1/clusters/YOUR_CLUSTER_NAME?format=blueprint" > blueprint.json

All of the configuration information is contained in that JSON file, which you can then parse and manipulate to re-construct the XML files.
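
As a rough sketch of that parse/re-construct step (not an official tool; the server, cluster name, and credentials are the same placeholders as in the curl command, and it assumes the usual blueprint layout where "configurations" is a list of {config-type: {"properties": {...}}} entries):

#!/usr/bin/env python3
# Sketch only: export the blueprint and write each config section as a Hadoop-style XML file.
import requests
from xml.sax.saxutils import escape

AMBARI = "http://YOUR_AMBARI_SERVER:8080"
CLUSTER = "YOUR_CLUSTER_NAME"
AUTH = ("ambariuser", "ambaripassword")

resp = requests.get(
    f"{AMBARI}/api/v1/clusters/{CLUSTER}?format=blueprint",
    auth=AUTH,
    headers={"X-Requested-By": "ambari"},
)
resp.raise_for_status()
blueprint = resp.json()

# Each entry looks like {"hdfs-site": {"properties": {...}}}; write one XML file per config type.
for section in blueprint.get("configurations", []):
    for config_type, body in section.items():
        properties = body.get("properties", {})
        with open(f"{config_type}.xml", "w") as out:
            out.write("<configuration>\n")
            for name, value in sorted(properties.items()):
                out.write("  <property>\n")
                out.write(f"    <name>{escape(name)}</name>\n")
                out.write(f"    <value>{escape(str(value))}</value>\n")
                out.write("  </property>\n")
            out.write("</configuration>\n")
        print(f"wrote {config_type}.xml")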



@Jay SenSharma while extracting I am getting an error:

tar: This does not look like a tar archive

gzip: stdin: not in gzip format

tar: Child returned status 1

tar: Error is not recoverable: exiting now

Jay SenSharma (Master Mentor)

@Anurag Mishra

Please do not use the "-iv" options with the curl command, as the "-i" flag writes the HTTP response headers into the output, which adds junk data to the tar.gz file.

Please try the following command:

# curl -u admin:admin -H "X-Requested-By: ambari" -X GET  http://localhost:8080/api/v1/clusters/Sandbox/components?format=client_config_tar   -o /tmp/All_Configs/cluster_configs.tar.gz

Output:

[root@sandbox All_Configs]# curl -u admin:admin -H "X-Requested-By: ambari" -X GET  http://localhost:8080/api/v1/clusters/Sandbox/components?format=client_config_tar   -o /tmp/All_Configs/cluster_configs.tar.gz
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 48887    0 48887    0     0  21626      0 --:--:--  0:00:02 --:--:-- 21631


[root@sandbox All_Configs]# ls
cluster_configs.tar.gz


[root@sandbox All_Configs]# tar -xvf cluster_configs.tar.gz
SPARK2_CLIENT/./
SPARK2_CLIENT/spark-defaults.conf
SPARK2_CLIENT/spark-env.sh
SPARK2_CLIENT/spark-log4j.properties
SPARK2_CLIENT/spark-metrics.properties
TEZ_CLIENT/./
TEZ_CLIENT/tez-site.xml
TEZ_CLIENT/tez-env.sh
SLIDER/./
SLIDER/slider-client.xml
SLIDER/hdfs-site.xml
SLIDER/yarn-site.xml
SLIDER/core-site.xml
SLIDER/slider-env.sh
SLIDER/log4j.properties
OOZIE_CLIENT/./
OOZIE_CLIENT/oozie-site.xml
OOZIE_CLIENT/oozie-log4j.properties
OOZIE_CLIENT/oozie-env.sh
SPARK_CLIENT/./
SPARK_CLIENT/spark-defaults.conf
SPARK_CLIENT/spark-env.sh
SPARK_CLIENT/spark-log4j.properties
SPARK_CLIENT/spark-metrics.properties
HDFS_CLIENT/./
HDFS_CLIENT/hdfs-site.xml
HDFS_CLIENT/core-site.xml
HDFS_CLIENT/log4j.properties
HDFS_CLIENT/hadoop-env.sh
FALCON_CLIENT/./
FALCON_CLIENT/falcon-env.sh
FALCON_CLIENT/runtime.properties
FALCON_CLIENT/startup.properties
HBASE_CLIENT/./
HBASE_CLIENT/hbase-policy.xml
HBASE_CLIENT/log4j.properties
HBASE_CLIENT/hbase-site.xml
HBASE_CLIENT/hbase-env.sh
INFRA_SOLR_CLIENT/./
INFRA_SOLR_CLIENT/log4j.properties
ZOOKEEPER_CLIENT/./
ZOOKEEPER_CLIENT/zookeeper-env.sh
ZOOKEEPER_CLIENT/log4j.properties
YARN_CLIENT/./
YARN_CLIENT/yarn-site.xml
YARN_CLIENT/yarn-env.sh
YARN_CLIENT/core-site.xml
YARN_CLIENT/log4j.properties
YARN_CLIENT/capacity-scheduler.xml
SQOOP/./
SQOOP/sqoop-env.sh
SQOOP/sqoop-site.xml
PIG/./
PIG/pig.properties
PIG/pig-env.sh
PIG/log4j.properties
MAPREDUCE2_CLIENT/./
MAPREDUCE2_CLIENT/core-site.xml
MAPREDUCE2_CLIENT/mapred-env.sh
MAPREDUCE2_CLIENT/mapred-site.xml
ATLAS_CLIENT/./
ATLAS_CLIENT/application.properties
ATLAS_CLIENT/atlas-log4j.xml
ATLAS_CLIENT/atlas-solrconfig.xml
ATLAS_CLIENT/atlas-env.sh
HIVE_CLIENT/./
HIVE_CLIENT/hive-site.xml
HIVE_CLIENT/hive-log4j.properties
HIVE_CLIENT/hive-exec-log4j.properties
HIVE_CLIENT/hive-env.sh


@Jay SenSharma

I ran the same command without -iv, but I am still getting an error while extracting the tar file:

tar -xvf cluster_configs.tar.gz

tar: This does not look like a tar archive

gzip: stdin: not in gzip format

tar: Child returned status 1

tar: Error is not recoverable: exiting now

tar -xzvf cluster_configs.tar.gz

gzip: stdin: not in gzip format

tar: Child returned status 1

tar: Error is not recoverable: exiting now

Jay SenSharma (Master Mentor)

@Anurag Mishra

What is the output of the following command?

# file cluster_configs.tar.gz

cluster_configs.tar.gz: gzip compressed data, from FAT filesystem (MS-DOS, OS/2, NT)


If your "file" command output is different then this (or like "data") then it means your binary is not downloaded properly. You can in that case try to use the same link from a browser where you are already logged in to ambari OR double check your curl command (Delete the previously downloaded file)

http://localhost:8080/api/v1/clusters/Sandbox/components?format=client_config_tar
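
A quick way to confirm this (just a sketch, using the same file path as above): a valid cluster_configs.tar.gz starts with the gzip magic bytes 0x1f 0x8b, while a failed download usually contains a plain-text or JSON error response from Ambari that you can simply print.

#!/usr/bin/env python3
# Sketch only: check whether the downloaded file is really gzip data.
path = "/tmp/All_Configs/cluster_configs.tar.gz"

with open(path, "rb") as f:
    head = f.read(512)

if head[:2] == b"\x1f\x8b":
    print("Looks like a gzip archive.")
else:
    print("Not gzip data; the file probably contains an error response:")
    print(head.decode("utf-8", errors="replace"))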



file cluster_configs.tar.gz

cluster_configs.tar.gz: ASCII text


@Jay SenSharma what could be the reason my tar file is different?

Jay SenSharma (Master Mentor)

@Anurag Mishra

Please run the following command once again:

# curl -u admin:admin -H "X-Requested-By: ambari" -X GET  http://localhost:8080/api/v1/clusters/Sandbox/components?format=client_config_tar   -o /tmp/All_Configs/new_cluster_configs.tar.gz

Then check:

# file /tmp/All_Configs/new_cluster_configs.tar.gz
