Support Questions

What should I do if my laptop does not meet the minimum requirement of 12 GB of RAM for running the Hortonworks HDP and HDF sandboxes together?


According to the tutorial "Analyze IoT Weather Station Data via Connected Data Architecture", I need to use both the HDP and HDF sandboxes, and running both of them together requires a minimum of 12 GB of RAM. The problem is that my laptop has only one memory slot with a maximum capacity of 8 GB of RAM. My question is whether using an SSD or flash drives would help in this case, or whether other alternatives are available.

1 ACCEPTED SOLUTION


Hey @Divya Sodha, that's a fair concern. We're working on a slight redesign of this tutorial to reduce, at least in part, how much RAM it needs, though some of the services are heavy enough that there is a lower bound on the required memory.

One way to get by with less memory for now is to stop any unused services via Ambari - that will provide the largest savings. I'm pinging the designer of this particular tutorial, @jmedel, to see about including this as a step in the next update of the tutorial.
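To make that concrete, here is a rough sketch of how you could stop services through the Ambari REST API instead of clicking through the UI. The host name, port, cluster name ("Sandbox"), credentials, and the list of services to stop are only illustrative defaults for the HDP sandbox - adjust them to your environment and to whatever the tutorial actually needs.

```python
import requests

# Illustrative defaults only - change host, cluster name, credentials, and
# the service list to match your own sandbox setup.
AMBARI_URL = "http://sandbox-hdp.hortonworks.com:8080"
CLUSTER = "Sandbox"          # default cluster name on the HDP sandbox
AUTH = ("admin", "admin")    # replace with your Ambari credentials
HEADERS = {"X-Requested-By": "ambari"}

def stop_service(service_name):
    """Ask Ambari to move a service to the INSTALLED (stopped) state."""
    url = f"{AMBARI_URL}/api/v1/clusters/{CLUSTER}/services/{service_name}"
    payload = {
        "RequestInfo": {"context": f"Stop {service_name} to free memory"},
        "Body": {"ServiceInfo": {"state": "INSTALLED"}},
    }
    resp = requests.put(url, json=payload, auth=AUTH, headers=HEADERS)
    resp.raise_for_status()
    print(f"Stop request accepted for {service_name}: HTTP {resp.status_code}")

# Example: stop services the weather-station tutorial does not rely on.
for svc in ["OOZIE", "FLUME", "ATLAS"]:
    stop_service(svc)
```

Doing the same thing from the Ambari UI (Services > service > Service Actions > Stop) works just as well; the API route is only handy if you want to script freeing up memory before each session.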

Regarding your other question, about whether using an SSD or flash drives would help - unfortunately, not really. The bottleneck here is RAM, and while extra disk space would help with swap, swapping is far slower than memory, so the performance gains are very likely minimal.
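If you want to confirm for yourself that memory, not disk, is the limiting factor while the sandboxes are running, a quick check on the host along these lines will show RAM essentially full and swap picking up the slack. This is just a sketch: it uses the third-party psutil package, and the thresholds are illustrative.

```python
import psutil

# Snapshot of current memory and swap usage on the host machine.
vm = psutil.virtual_memory()
sw = psutil.swap_memory()

print(f"RAM : {vm.used / 2**30:.1f} GiB used of {vm.total / 2**30:.1f} GiB "
      f"({vm.percent}% used)")
print(f"Swap: {sw.used / 2**30:.1f} GiB used of {sw.total / 2**30:.1f} GiB "
      f"({sw.percent}% used)")

# Heavy swap usage alongside near-full RAM means the VMs are memory-bound;
# a faster disk only makes the swapping less painful, it does not remove it.
if vm.percent > 90 and sw.percent > 25:
    print("Memory-bound: faster storage will not substitute for more RAM.")
```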

Another alternative would be to use a single, larger platform rather than two separate sandboxes (HDP + HDF). I'll bring this up with the tutorial designer and get a discussion going.


2 REPLIES


@Edgar Orendain

Thank you so much, sir, for your guidance and response. As you suggested, I will learn Ambari and try stopping unused services to see whether that helps.

I hope the changes to the tutorial are made as soon as possible, as they will help me a lot. I would have tried it already, but the cost of the hardware needed to implement the tutorial is acting as a barrier.

I was also wondering what I could use as a single platform instead of the two sandboxes (HDF and HDP), and how to set it up.

Once again, thank you very much, sir.

@jmedel

Sir, please reply to me whenever the changes are made.

If they are done quickly, it will help me complete my BE final-year project. It will help me a lot, sir.

Thanks for the support.