Support Questions

Find answers, ask questions, and share your expertise

Will an SSD work to satisfy RAM requirements?

The minimum RAM requirement for running both the Hortonworks HDP and HDF sandboxes is 12 GB, but my laptop supports a maximum of only 8 GB of RAM. So, what should I do? Would using an SSD help me here? I mean, do the sandboxes support an SSD in place of RAM? The tutorial number was 820.


Super Mentor

@Divya Sodha

Ideally you should have at least 8 GB of RAM allocated to the HDP Sandbox, as per the doc:

Note: The Sandbox system requirements include that you have a 64 bit OS with at least 8 GB of RAM and enabled BIOS for virtualization. Find out about the newest features, known and resolved issues along with other updates on HDP and HDF from the release notes.


So your laptop should have more than 8 GB of RAM to run the Sandbox smoothly, so that you can allocate 8 GB to the Sandbox itself. Allocating less than 8 GB of RAM to the Sandbox can cause lots of issues, such as a very long boot time and some services (network services, for example) failing to come up properly, etc.
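If the sandbox is imported into VirtualBox, the RAM allocation can also be adjusted from the command line. A minimal sketch, assuming the VM was imported under the name "Hortonworks Sandbox HDP" (that name is an assumption; check your actual one first):

```shell
# List the VMs registered with VirtualBox to find the exact sandbox name:
VBoxManage list vms

# With the VM powered off, allocate 8 GB (8192 MB) of RAM to it.
# "Hortonworks Sandbox HDP" is an assumed name -- substitute your own:
VBoxManage modifyvm "Hortonworks Sandbox HDP" --memory 8192
```

The same setting is available in the VirtualBox GUI under Settings > System > Base Memory.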

@Jay Kumar SenSharma

Thank you for the reply. But actually, according to the tutorial "Analyze IOT weather station data via connected data architecture", I need to use both the HDP and HDF sandboxes, so there is a minimum requirement of 12 GB of RAM for both of them to work. The problem is that my laptop has only one memory slot, with a maximum capacity of 8 GB of RAM. So my question is whether the use of an SSD or flash drive would be helpful in this case.

Super Collaborator

@Divya Sodha

I think you may be confusing the purpose of RAM with that of a hard drive. Even a "RAM disk" is the reverse of what you are asking: putting files into your RAM.

If you need lighter resource usage, you are welcome to create your own base VM, install Ambari following the HDP documentation, and then install only the minimal set of components you need for your learning purposes.

The only reason the sandbox needs 8+ GB is to run the majority of the HDP components. Also, many of the Hadoop processes run in Java, which uses a heap size configurable from the Ambari configurations. I have been able to run a single-node Hadoop cluster within 4-6 GB of RAM, depending on what other services I had running alongside it.
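As a rough illustration of how those heap sizes are trimmed for a small setup, here is a hedged hadoop-env.sh fragment. These settings are normally managed through Ambari's configuration UI, and the values below are illustrative assumptions for a single-node cluster, not recommendations:

```shell
# Illustrative hadoop-env.sh heap settings for a small single-node cluster.
# All values are example assumptions; tune them through Ambari in practice.
export HADOOP_HEAPSIZE=1024   # default daemon heap, in MB

# Per-daemon overrides (prepended so they take precedence):
export HADOOP_NAMENODE_OPTS="-Xmx1024m ${HADOOP_NAMENODE_OPTS}"
export HADOOP_DATANODE_OPTS="-Xmx512m ${HADOOP_DATANODE_OPTS}"
```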

Keep in mind, your OS itself needs 1-2GB of RAM.

New Contributor

If one has an SSD as a secondary disk (or even as the C: drive), one can extend Windows virtual memory without much of a penalty, as an SSD is much faster than a hard drive. However, I am not sure whether VirtualBox will accept this Windows arrangement.

Super Collaborator

Simply put: No.

And some reasoning on this:

An SSD is always connected via a disk interface (SATA, M.2, or PCIe), which makes it, by definition, a disk. All of these interfaces are significantly slower than the RAM interface.

Because an SSD has fast access times, swapping memory to it is much faster than swapping to an HDD, so you will notice increased performance whenever your machine is using swap space (virtual RAM). But the disk interface still limits the throughput.

To get satisfactory response times from a virtual machine, the RAM assigned to the virtual machine should be backed by physical RAM on the host. Otherwise, swapping initiated inside the VM becomes very annoying: just think of the case where the VM wants to load something from the VM's virtual memory into the VM's RAM, and from there into the VM's cache, and every one of those layers could again be on the SSD. An SSD is better than an HDD, but it is still quite slow compared to RAM, so you must bring quite some patience.
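On a Linux host you can check directly whether the VM's memory is being paged out to disk. A minimal sketch, assuming the standard procps tools are installed:

```shell
# Show physical RAM and swap usage in megabytes; a growing "used" value
# in the Swap row while the VM runs means its RAM is being paged to disk:
free -m

# Watch swap-in (si) and swap-out (so) activity, one sample per second:
vmstat 1 5
```

Consistently non-zero si/so columns while the sandbox is running are exactly the slow-swap scenario described above.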

So if the VM needs 8 GB (as Jay stated), you should have more than 8 GB of physical RAM (8 GB + the host OS requirement + the VirtualBox requirement), which typically works out to something like 12 GB, just as Jay mentioned.
