I want to install a multi-node Hadoop and Spark cluster using Docker images, similar to the HDP Sandbox.
Once a container starts, all of the Hadoop and Spark services should start automatically inside it.
What exactly I want to do:
-->Run 3 Docker containers; one should act as the master (Hadoop & Spark)
-->The remaining 2 containers should act as the slaves (Hadoop & Spark)
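To make the topology concrete, this is a rough docker-compose sketch of what I have in mind. The image name `my-hadoop-spark` is a placeholder for an image I would build myself with Hadoop and Spark installed; the ports are the default web UI ports:

```yaml
version: "3"
services:
  master:
    image: my-hadoop-spark   # placeholder: custom image with Hadoop + Spark
    hostname: master
    ports:
      - "9870:9870"   # HDFS NameNode web UI
      - "8088:8088"   # YARN ResourceManager web UI
      - "8080:8080"   # Spark master web UI
  slave1:
    image: my-hadoop-spark
    hostname: slave1
    depends_on:
      - master
  slave2:
    image: my-hadoop-spark
    hostname: slave2
    depends_on:
      - master
```

Compose puts all three containers on one network, so the slaves can reach the master by the hostname `master`.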
Once the installation is done, whenever I restart the containers, the Hadoop and Spark services should come up automatically again.
Can anyone please suggest how to approach this task?
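For the auto-start part, my rough (untested) idea is an entrypoint script baked into the image. The install paths `/opt/hadoop` and `/opt/spark` and the hostname check are my assumptions; the `sbin` scripts are the standard ones shipped with Hadoop and Spark 3.x:

```sh
#!/bin/bash
# start-services.sh -- container entrypoint (sketch, not a tested setup)
# Assumes Hadoop under /opt/hadoop and Spark under /opt/spark.

service ssh start                            # Hadoop start scripts use ssh

if [ "$(hostname)" = "master" ]; then
  /opt/hadoop/sbin/start-dfs.sh              # NameNode + DataNodes
  /opt/hadoop/sbin/start-yarn.sh             # ResourceManager + NodeManagers
  /opt/spark/sbin/start-master.sh            # Spark standalone master
else
  /opt/spark/sbin/start-worker.sh spark://master:7077   # Spark worker
fi

tail -f /dev/null                            # keep the container alive
```

I would set this as the `ENTRYPOINT` in the Dockerfile so the services start on every container (re)start.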