@Jay Kumar SenSharma
When I try to start services now I'm getting:
For HDFS Client Install
RuntimeError: Failed to execute command '/usr/bin/yum -y install hadoop_3_0_0_0_1634', exited with code '1', message: 'Error unpacking rpm package hadoop_3_0_0_0_1634-3.0.0.0-1634.x86_64'
For Hive Client Install
RuntimeError: Failed to execute command '/usr/bin/yum -y install hive_3_0_0_0_1634-hcatalog', exited with code '1', message: 'Error unpacking rpm package hadoop_3_0_0_0_1634-3.0.0.0-1634.x86_64'
So I resolved all of this. I followed the steps here to remove all my packages, then deleted the leftover HDP install tree:
rm -rf /usr/hdp/
Then in Ambari I used the "Start all Services" command and it went through and installed everything again for me.
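The cleanup above can be sketched as a dry-run script. The package names are the ones from the yum errors in this thread; swap each echo for the real command only once you have verified the list on your own cluster:

```shell
# Dry-run sketch of the cleanup. Package names come from the errors above;
# the echoes stand in for the destructive commands.
hdp_pkgs="hadoop_3_0_0_0_1634 hive_3_0_0_0_1634-hcatalog"

for p in $hdp_pkgs; do
  echo "would remove: $p"        # real command: yum -y remove "$p"
done

echo "would delete: /usr/hdp/"   # real command: rm -rf /usr/hdp/
```

After that, a fresh "Start all Services" in Ambari pulls the packages down again cleanly.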
I also copied /usr/hdp/3.0.0.0-.../spark2/aux/ to all the other nodes in my cluster. Now all my NodeManagers are coming up and things are looking good.
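If you need to push those spark2 aux jars out by hand, a minimal sketch looks like the following. The hostnames and the exact versioned directory are placeholders; substitute your own:

```shell
# Push the spark2 aux directory to the other nodes. Hostnames and the
# 3.0.0.0-1634 version directory are placeholders for your cluster.
hosts="node2.example.com node3.example.com"
AUX_DIR=/usr/hdp/3.0.0.0-1634/spark2/aux

for host in $hosts; do
  echo "would copy $AUX_DIR to $host"
  # real command: scp -r "$AUX_DIR" "$host:$(dirname "$AUX_DIR")"
done
```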
I'm glad that's all sorted now. Another way is to delete the particular node from the cluster, re-add it, and then add the Spark client on it. I recently did that on one of my test clusters and it worked.
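For the remove-and-re-add route, the host deletion can also be driven through Ambari's REST API instead of the UI. A sketch, where the server URL, cluster name, and hostname are placeholders:

```shell
# Delete a host from the cluster via Ambari's v1 REST API. All components
# on the host should be stopped and removed before deleting the host itself.
AMBARI=http://ambari-server.example.com:8080
CLUSTER=mycluster
HOST=worker1.example.com

url="$AMBARI/api/v1/clusters/$CLUSTER/hosts/$HOST"
echo "DELETE $url"
# real call (Ambari requires the X-Requested-By header):
# curl -u admin:admin -H 'X-Requested-By: ambari' -X DELETE "$url"
```

Re-adding the node and the Spark client is then done through the normal Ambari "Add Host" / "Add Client" flow.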