04-09-2018 10:20 AM
I'm not sure I understand your intention in running multiple DataNodes on the same machine.
If you want the DataNode to store data in multiple directories on the same machine, you can use CM -> HDFS -> Configuration -> dfs.datanode.data.dir and list your directories there.
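For reference, the same setting in a plain hdfs-site.xml looks roughly like this (the paths below are examples, not defaults):

```xml
<!-- hdfs-site.xml: comma-separated list of local directories;
     the DataNode spreads block storage across all of them -->
<property>
  <name>dfs.datanode.data.dir</name>
  <value>/data/1/dfs/dn,/data/2/dfs/dn,/data/3/dfs/dn</value>
</property>
```

Note this gives one DataNode multiple storage directories; it does not start multiple DataNode processes.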
04-11-2018 11:14 AM
Hadoop wasn't designed to run multiple DataNodes on a single host, and Cloudera Manager prohibits it.
The reason for a single DataNode per host is to prevent data loss. With the default replication factor of 3, every block in a file is replicated to 3 different hosts. If a host holding a block replica goes down, the NameNode marks the block as under-replicated, and a new copy of the block is created on another DataNode, bringing the number of replicas back to 3. Multiple DataNodes on one host would defeat this: several "replicas" could land on the same physical machine and be lost together.
If you do not care about data durability, my suggestion is to set the replication factor to 1, or to use virtual machines so that each DataNode still gets its own (virtual) host.
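A minimal sketch of that replication setting in hdfs-site.xml (it only affects newly written files; existing files keep their replication factor unless you change it with `hdfs dfs -setrep`):

```xml
<!-- hdfs-site.xml: default replication factor for new files;
     1 means a single copy, so losing that DataNode loses the data -->
<property>
  <name>dfs.replication</name>
  <value>1</value>
</property>
```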