
VMWare download

New Contributor

After failing to get HDP 2.5 working on Oracle Virtual Box (downloaded the huge .ova file twice, imported it twice), I would like to try running it in VMWare, but I cannot seem to find a download link for it on the sandbox page.

Is there a free version of VMWare, like Oracle Virtual Box, that someone could point me to?

7 REPLIES

Re: VMWare download

Hi @Navin Ladda. This is what I've used. http://www.vmware.com/products/player/playerpro-evaluation.html

What error are you getting when importing the .ova file? I've always used Virtual Box and haven't had any problems.

Re: VMWare download

New Contributor

I did not get any errors importing the .ova file in Virtual Box. I just cannot get the Ambari URL 127.0.0.1:8080 to accept the user credentials mentioned in the "Learning the ropes of sandbox" tutorial web pages.

I can get to the Ranger UI and also to the 127.0.0.1:8888 welcome page, but I cannot launch the Dashboard link. I have to go to the Quick Links under Advanced Settings to see the URLs for the remaining HDP components.

Also, sometimes when the machine starts and I log in with root/hadoop and run an ls command, I see just two or three files, and the ambari-admin password reset script is missing. I have a Windows 7 machine with 8 GB of RAM, the network type set to NAT, and about 5 GB assigned to the VM image.

The CentOS Linux 7 that the VM loads takes a long time, and it does not matter if I close all other applications on the machine. I have HDP 2.1 installed and that works fine, but it is now very old, so I want to get HDP 2.5 working.

(screenshot attached: 7825-capture.jpg)

Re: VMWare download

Guru

@Scott Shaw Is 5 GB allocated to the VM sufficient? I was under the impression that 8 GB was required for running Ambari.

Re: VMWare download

New Contributor

My impression was that 8 GB is the total required on the machine; the sandbox specs page does not say how much should be allocated to the VM. What is weird is that when I log in to the terminal via Virtual Box, I can get in with root/hadoop and it never asks me to change the password, and when I run an ls command I see just the one file and folder shown in the screenshot above.

But if I use PuTTY and log in to 127.0.0.1 on port 2222 with root/hadoop, it asks me to change the password, I see a different set of files, and I am even able to change the Ambari admin password. After that, though, when I try to log in to 127.0.0.1:8080 as raj_ops, maria_dev, or even the new Ambari admin user with the new password, it does not log in. I even tried the IP shown under inet eth0 in the screenshot below, and it does not even show the login dialog box. So frustrating!

Meanwhile, the 11.1 GB VMWare download is still in progress.

(screenshot attached: 7826-capture.jpg)

Re: VMWare download

@Navin Ladda For reference, I have 8 GB RAM, 4 CPUs, and 48 GB configured for my sandbox instance.
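If you want to bump your Virtual Box allocation up toward those numbers, the commands below are a minimal sketch; the VM name "Hortonworks Sandbox HDP 2.5" is an assumption, so check VBoxManage list vms for the exact name on your machine.

# List registered VMs to confirm the exact sandbox name (the name below is an assumption)
VBoxManage list vms

# With the VM powered off, raise memory to 8 GB and CPUs to 4
VBoxManage modifyvm "Hortonworks Sandbox HDP 2.5" --memory 8192 --cpus 4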

You can sign in as raj_ops/raj_ops and that should allow you to manage the sandbox via Ambari.

To log in to the sandbox through the CLI, type: ssh root@127.0.0.1 -p 2222

Type in "hadoop" as the password and it should then prompt you to change the password. From there you should be able to run the ambari password reset command.

As for the Dashboard link, you may need to make sure the popup blocker is turned off in your browser. I turned mine off, but it required a browser restart to finally work.

Re: VMWare download

New Contributor

@Scott Shaw

Unfortunately, I have tried everything you mention, but I cannot log in. My machine has only 8 GB of RAM, so I cannot allocate all of it to the VM. I did finish downloading the VMWare image, but the VM under it does not start even after a long time; it is stuck on the bar that shows CentOS loading.

I give up. The .ova file sizes have gone up consistently since HDP 2.1 and have now reached 11 GB, so it looks like the RAM requirements are also higher for it to work properly, and an ordinary laptop is not going to be enough.


Re: VMWare download

Super Guru

I am allocating 8 GB on my 16 GB MacBook and it works fine.

You can also run an image on Azure or AWS pretty cheap.

Also, if you have an older desktop or laptop, you can use that as a dedicated machine for the sandbox.

There are a lot of Apache projects in there and a lot of services to run to get all the features. You are trying to run a massively parallel Big Data system, with a lot of monitoring, management, and services, in one little VM; it's a lot of stuff. For Spark, 4 GB is really the minimum when I run jobs, so 8 GB total is tough. You could allocate as much of your 8 GB as you can and avoid running other apps or browsers at the same time, or set up a cheap Ubuntu or CentOS box as a one-server cluster; older or used servers with 8 GB of RAM or more are cheap.
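To give a feel for where that memory goes, here is a minimal sketch of submitting the stock SparkPi example on the sandbox with explicit memory settings; the jar path under /usr/hdp/current/spark-client is an assumption and may differ on your image:

# Run the bundled SparkPi example with modest memory settings (jar path is an assumption)
spark-submit --master yarn --deploy-mode client \
  --driver-memory 1g --executor-memory 2g --num-executors 2 \
  --class org.apache.spark.examples.SparkPi \
  /usr/hdp/current/spark-client/lib/spark-examples*.jar 10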

I have sometimes had to rename OVF files to OVA to get them to import.
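For example (a sketch; the file name here is hypothetical):

# Rename the appliance file and re-import it into Virtual Box
mv HDP_2.5_sandbox.ovf HDP_2.5_sandbox.ova
VBoxManage import HDP_2.5_sandbox.ova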

If you want to run a minicluster to try things: https://github.com/sakserv/hadoop-mini-clusters
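A quick way to try it, assuming a standard Maven build (which is how I recall the project being laid out):

# Clone and build the mini-cluster library locally (requires Java and Maven)
git clone https://github.com/sakserv/hadoop-mini-clusters
cd hadoop-mini-clusters
mvn clean install -DskipTests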
