Created 08-10-2018 07:29 PM
@Jay Kumar SenSharma
When I try to start services now I'm getting:
For HDFS Client Install
RuntimeError: Failed to execute command '/usr/bin/yum -y install hadoop_3_0_0_0_1634', exited with code '1', message: 'Error unpacking rpm package hadoop_3_0_0_0_1634-3.1.0.3.0.0.0-1634.x86_64'
For Hive Client Install
RuntimeError: Failed to execute command '/usr/bin/yum -y install hive_3_0_0_0_1634-hcatalog', exited with code '1', message: 'Error unpacking rpm package hadoop_3_0_0_0_1634-3.1.0.3.0.0.0-1634.x86_64'
Created 08-10-2018 09:20 PM
@Jay Kumar SenSharma I'm definitely in a jam now. Really hoping you can help me. A bit scared to touch anything at this point.
Created 08-11-2018 12:05 AM
So I resolved all this. I followed the steps here to remove all my packages, then deleted the leftover install files:
rm -rf /usr/hdp/
Then in Ambari I used the "Start all Services" command, and it went through and installed everything again for me.
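For anyone hitting the same errors, the cleanup amounts to roughly this (a sketch only; the package names are taken from the error messages above, and your cluster will likely have more stack packages to remove):

```shell
# Remove the partially-installed HDP stack packages.
# Wildcards catch sub-packages like hive_3_0_0_0_1634-hcatalog.
yum -y remove "hadoop_3_0_0_0_1634*" "hive_3_0_0_0_1634*"

# Clear out the leftover install tree so the reinstall starts clean.
rm -rf /usr/hdp/

# Then use "Start all Services" in the Ambari UI, which reinstalls
# the missing packages on each host before starting them.
```

Obviously double-check what lives under /usr/hdp/ on your hosts before deleting it.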
Then I copied the contents of /usr/hdp/3.0.0.0-.../spark2/aux/ to all the other nodes in my cluster. Now all my nodemanagers are coming up and things are looking good.
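The copy step can be scripted along these lines (a sketch; HDP_VER and the host list are placeholders you need to fill in with your own values):

```shell
# Push the spark2 aux directory to the other nodes.
# HDP_VER and the hostnames below are placeholders, not real values.
HDP_VER="3.0.0.0-nnnn"
for host in node2 node3 node4; do
  scp -r "/usr/hdp/${HDP_VER}/spark2/aux" "${host}:/usr/hdp/${HDP_VER}/spark2/"
done
```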
Created 08-11-2018 05:10 AM
I'm glad that's all sorted now. Another way would have been deleting the particular node from the cluster, re-adding it, and then adding the Spark client on it. I recently did that on one of my test clusters and it worked.