We managed to solve this issue through our own investigation. I have not seen any similar reports of this error, so I'll record what happened in case somebody gets stuck on something similar in the future.

On this particular deployment, we initially deployed version 22.214.171.124, then had to downgrade to 126.96.36.199 due to a Java incompatibility with the SAS/ACCESS Interface to Hadoop. Since this was a fresh deployment, we handled the downgrade by simply wiping out the existing installation using the HostCleanup.py script and the ambari-server reset command, then redeploying. A few leftover artifacts caused warnings during configuration, but we were able to remove them and continue deploying.

Once the cluster was fully deployed and validated as operational, we went to deploy the SAS Embedded Process as an additional service and found that any attempt to change the server configuration triggered a glitch in the UI: we could not move the data directory away from /home/hadoop/hdfs/data, which is an invalid storage location. We discovered that the original data directory, /hadoop/hdfs/data, had not been wiped after the initial deployment and still carried the previous deployment's IDs on all of its folders.

To reset this directory, we moved each data directory to /hadoop/hdfs/data.old, then rebooted the data nodes so they would recreate clean folders with the names the name nodes expected. For whatever reason, that mismatch in folder names was causing the configuration UI to discard our changes and revert to the default settings. Getting the correct data folders in place fixed our issue with the UI.
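For anyone who hits the same thing, the reset procedure above can be sketched roughly as the following commands, run on each data node after stopping HDFS from Ambari. This is a hedged sketch, not an exact transcript of what we ran: the data directory path matches the one in this post, but the owner/group (hdfs:hadoop) and permissions are typical HDP defaults and may differ on your cluster.

```shell
#!/bin/sh
# Run on each DataNode with HDFS stopped (via Ambari).
# Assumption: dfs.datanode.data.dir is /hadoop/hdfs/data, as in this post.
DATA_DIR=/hadoop/hdfs/data

# Preserve the stale directory (it still has the old deployment's IDs)
# instead of deleting it, so nothing is lost if we need to roll back.
sudo mv "$DATA_DIR" "${DATA_DIR}.old"

# Recreate an empty directory; on restart the DataNode repopulates it
# with the VERSION file and block-pool folders the NameNode expects.
# hdfs:hadoop and mode 750 are the usual HDP defaults -- verify yours.
sudo mkdir -p "$DATA_DIR"
sudo chown hdfs:hadoop "$DATA_DIR"
sudo chmod 750 "$DATA_DIR"

# Then restart the DataNodes. We rebooted the hosts outright; restarting
# the HDFS service from Ambari should achieve the same re-registration.
```

Once the data nodes came back up with freshly generated folders, the configuration UI stopped reverting our changes.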