How to get disk info using Ambari API?

Expert Contributor

Hi. I have a question about the Ambari API. I want to run the hdp-configuration-utils script, but I need a few pieces of information: the number of cores, memory, disks, and whether HBase is enabled (I did not install it, so the value is 'False').

My questions:

1. When I run this command:

GET api/v1/clusters/c1/hosts

I get two parameters, 'cpu_count' and 'ph_cpu_count'. Which one should I use?

2. How can I check the number of disks?

3. How can I get information about free and total disk size? I found two fields:

- disk_info

"disk_info" : [
      {
        "available" : "42331676",
        "device" : "/dev/mapper/VolGroup-lv_root",
        "used" : "6521952",
        "percent" : "14%",
        "size" : "51475068",
        "type" : "ext4",
        "mountpoint" : "/"
      },
      {
        "available" : "423282",
        "device" : "/dev/sda1",
        "used" : "38770",
        "percent" : "9%",
        "size" : "487652",
        "type" : "ext4",
        "mountpoint" : "/boot"
      },
      {
        "available" : "45423700",
        "device" : "/dev/mapper/VolGroup-lv_home",
        "used" : "53456",
        "percent" : "1%",
        "size" : "47917960",
        "type" : "ext4",
        "mountpoint" : "/home"
      }
    ]

- metrics/disk

"disk" : {
      "disk_free" : 83.99,
      "disk_total" : 95.25,
      "read_bytes" : 1.9547998208E10,
      "read_count" : 1888751.0,
      "read_time" : 2468451.0,
      "write_bytes" : 1.5247885312E10,
      "write_count" : 2020357.0,
      "write_time" : 9.9537697E7
    }

Which one should I check when I want to compare against the official sizing recommendations?

3 REPLIES

Super Collaborator

@Mateusz Grabowski

1. From the Ambari agent implementation, it seems both counts use the same Python API - https://docs.python.org/2/library/multiprocessing.html#multiprocessing.cpu_count - so they should report the same value. Are you seeing different values in your API response? If not (i.e., they are identical), I can create a JIRA to track this bug.

2. There does not seem to be a direct entry for that. You will probably have to use the length of the 'disk_info' array in the JSON response to determine it (see the sketch after this list).

3. Both of them give the same information: the first one is per-disk stats in KB, and the second one is total disk stats in GB.
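
If it helps, here is a rough Python sketch of how you could pull those numbers out of a single host resource. It is untested; the server address, cluster name 'c1', host name, and credentials are placeholders, and it assumes 'disk_info' sits under 'Hosts' in the response as it does in the hosts API output:

import requests

# Placeholders - adjust the Ambari server address, cluster, host name and credentials.
HOST_URL = "http://ambari-server:8080/api/v1/clusters/c1/hosts/host1.example.com"

resp = requests.get(HOST_URL, auth=("admin", "admin"))
resp.raise_for_status()
host = resp.json()

# Number of cores reported by the agent.
cores = host["Hosts"]["cpu_count"]

# Number of disks = length of the disk_info array.
disk_info = host["Hosts"]["disk_info"]
num_disks = len(disk_info)

# Per-disk sizes are strings in KB; convert the sums to GB for comparison.
total_gb = sum(int(d["size"]) for d in disk_info) / 1024.0 / 1024.0
free_gb = sum(int(d["available"]) for d in disk_info) / 1024.0 / 1024.0

print(cores, num_disks, total_gb, free_gb)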

Expert Contributor

@Aravindan Vijayan

1. I see the same values, so that works fine.

3. But when I added up the sizes of the 3 disks from the first one, I got 99880680 KB, while the second one shows 95.25. What is the reason for the difference between them?

Super Collaborator

@Mateusz Grabowski

The difference could be due to when the disk stats were measured. The stats in KB come from the Ambari agent, and the values in GB come from the Ambari Metrics Service.
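
For what it's worth, part of the gap may simply be unit conversion. If the GB figures use binary units (an assumption on my part, 1 GB = 1024 * 1024 KB), your summed sizes line up closely with metrics/disk, and the small remaining difference in free space would be consistent with the stats being sampled at different times:

# Quick check using the numbers from the post above
# (assumes 1 GB = 1024 * 1024 KB, which is an assumption, not confirmed).
sizes_kb = [51475068, 487652, 47917960]   # disk_info "size" values
avail_kb = [42331676, 423282, 45423700]   # disk_info "available" values

print(sum(sizes_kb) / 1024.0 / 1024.0)    # ~95.25, matches disk_total = 95.25
print(sum(avail_kb) / 1024.0 / 1024.0)    # ~84.09, close to disk_free = 83.99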