Created on 01-09-2014 10:37 AM - edited 09-16-2022 01:52 AM
Yesterday I configured a 5-node cluster via Cloudera Manager. I then added another node and was about to create a "Gateway Template" to assign to it through the Add Node workflow when Firefox crashed. After loading Cloudera Manager back up, I manually assigned all the gateway roles to the new host (HDFS, MapReduce, HBase, Hive).
However, when I run "hadoop fs -ls" on that node, it essentially prints the output of "ls -la" on my local filesystem: whichever directory I am in locally, "hadoop fs -ls" shows me its files, their permissions, and the hidden files.
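If it helps, this looks like the client falling back to the default filesystem (file:///) because no core-site.xml has been deployed to the host. One way to confirm that, assuming a standard CDH client layout (the command and expected output below are a sketch, not output from my cluster):

# With no client configuration deployed, fs.defaultFS defaults to
# file:///, so "hadoop fs -ls" lists the local working directory.
hdfs getconf -confKey fs.defaultFS
# A correctly configured gateway should print something like
# hdfs://<namenode-host>:8020 (host and port are placeholders).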
What is the fix for this?
Would it hurt to set up another node and, instead of assigning it gateway roles through Cloudera Manager or installing the Cloudera Agent, simply install the client packages for HDFS, MapReduce, Hive, HBase, Pig, Oozie, etc. and connect it manually as a "gateway" node?
Thank you.
Created 01-09-2014 10:59 AM
Hi Matthew,
After using the host template to assign gateway roles to your host, did you deploy the client configuration? That step updates all the files in /etc/hadoop/conf, /etc/hive/conf, etc.
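In case it is useful for verifying: after running the Deploy Client Configuration action, the gateway's config directory should be populated and the client should point at the NameNode. A rough check from the gateway host (the port shown is a typical CDH default, offered as a sketch):

ls -l /etc/hadoop/conf/               # should now contain core-site.xml, hdfs-site.xml, etc.
hdfs getconf -confKey fs.defaultFS    # should print hdfs://<namenode-host>:8020
hadoop fs -ls /                       # should list the HDFS root, not the local directory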
Thanks,
Darren