On a cluster, I am downloading the client configs via the Ambari UI/API. How can I make sure all the client configs were actually downloaded? In other words, how do I know which files are the config files for a client component?
If you are running the Hadoop/HDFS CLI, you can always specify the configuration directory with the --config confdir option on the command. This directory populates the HADOOP_CONF_DIR environment variable and is added to the JVM CLASSPATH.
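A minimal sketch of both options above (the directory /tmp/hdfs-client-conf is a hypothetical path; substitute wherever you extracted the Ambari-downloaded configs):

```shell
# Hypothetical location of an extracted client-config download.
mkdir -p /tmp/hdfs-client-conf

# Option 1: per-command, via --config (shown commented since it
# needs a real cluster):
#   hdfs --config /tmp/hdfs-client-conf dfs -ls /

# Option 2: for the whole session, via the environment variable the
# Hadoop scripts read:
export HADOOP_CONF_DIR=/tmp/hdfs-client-conf
echo "$HADOOP_CONF_DIR"
```

Either way, the CLI resolves core-site.xml, hdfs-site.xml, etc. from that directory instead of the default /etc/hadoop/conf.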
If you are running your own Java client, you can check the JVM CLASSPATH and ensure the proper client configuration location is added in the right order.
@Xiaoyu Yao, thanks for the reply. What I was looking for is how to identify client-specific files in a conf directory.
E.g., in /etc/hadoop/conf we have multiple config files. How can I find which ones are server-specific and which ones are client-specific?
@dbalasundaran, part of the point of using Ambari is that you don't need to worry about which files a client needs, which may change from release to release. The GUI lets you specify where the downloaded tarball goes, and both Ambari and the tar extraction command will complain if the tarball is corrupted or incomplete. So you should be able to use it with confidence.
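A quick way to check the download yourself: tar's -t flag lists the archive contents without extracting and exits non-zero on corruption. The file names below are placeholders standing in for a real Ambari download:

```shell
# Stand-in for a downloaded client-config tarball (hypothetical names).
mkdir -p demo-conf
echo '<configuration/>' > demo-conf/core-site.xml
tar -czf client-configs.tar.gz demo-conf

# List the entries without extracting; a corrupted or truncated
# tarball makes tar exit with a non-zero status here.
tar -tzf client-configs.tar.gz >/dev/null && echo "tarball OK"
```

The listing also doubles as an answer to the original question: the entries in the tarball are exactly the files Ambari considers client configs.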
Of course, it's not beyond possibility that the Ambari specification could be buggy, so if you think you're seeing a wrong result in your client download, by all means bring it up.
Have you seen a problem that makes you concerned? Or are you just trying to be thorough? Thanks.
@Matt Foley, thanks for your reply. I have not seen any issues yet, but I was wondering how I can make sure it has indeed downloaded all the client configs. One way would be to understand the client config files for a service; hence the question. Yes, it was out of curiosity :)
For most of the services in Ambari, the "Service Actions" drop-down button has a "Download Client Configs" option that is useful for downloading the configurations needed by a client.
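The same download is available over the Ambari REST API via the client_config_tar format, which can be handy for scripting. A sketch for the HDFS client component, assuming hypothetical host, cluster, and credential values (adjust all of them for your site):

```shell
# Hypothetical Ambari endpoint and cluster name -- adjust for your site.
AMBARI_URL="http://ambari.example.com:8080"
CLUSTER="mycluster"

# REST equivalent of the GUI's "Download Client Configs" action,
# shown at component level for the HDFS client:
DOWNLOAD_URL="$AMBARI_URL/api/v1/clusters/$CLUSTER/services/HDFS/components/HDFS_CLIENT?format=client_config_tar"
echo "$DOWNLOAD_URL"

# To actually fetch and inspect the tarball against a live cluster:
#   curl -u admin:admin -o hdfs_client_conf.tar.gz "$DOWNLOAD_URL"
#   tar -tzf hdfs_client_conf.tar.gz   # lists the client config files
```

Listing the fetched tarball shows precisely which files Ambari treats as client configs for that component.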