Created on 03-02-2017 09:07 PM - edited 09-16-2022 04:11 AM
I have been able to deploy Hortonworks on Azure, but when I run ssh -p 2222 root@localhost it asks for the root password, and when I enter the same password I use to SSH through PuTTY on port 22 (the one I set when creating my VM), I get a "permission denied" error. I'm looking for help on how to solve this issue, as I'm just starting to learn HDP. Thanks.
Created 03-02-2017 09:17 PM
If this is the HDP sandbox, the root password is hadoop.
But if you installed HDP via IaaS on Azure, the root password is as you defined it in Azure (if you chose the password, non-ssh option). That is also assuming you defined the 'root' account as the password user.
The other option in Azure is to use SSH keys, in which case you will need to download the .pem file to access the host.
Created 03-02-2017 10:11 PM
Heya, you mentioned deploying on Azure, but are SSH'ing into localhost? Try following this and see if it helps: https://hortonworks.com/hadoop-tutorial/port-forwarding-azure-sandbox/
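For reference, here is a hedged sketch of what that tutorial's port-forwarding setup looks like (the host alias, user name, and <vm-public-ip> placeholder are assumptions; adjust them to your own Azure VM). The entry goes in ~/.ssh/config on your local machine:

```
Host azureSandbox
    HostName <vm-public-ip>
    User <your-azure-admin-user>
    LocalForward 8080 127.0.0.1:8080
    LocalForward 8888 127.0.0.1:8888
    LocalForward 9995 127.0.0.1:9995
    LocalForward 9996 127.0.0.1:9996
    LocalForward 8886 127.0.0.1:8886
    LocalForward 10500 127.0.0.1:10500
    LocalForward 4200 127.0.0.1:4200
    LocalForward 2222 127.0.0.1:2222
```

With that in place, `ssh azureSandbox` opens the tunnels, and `ssh -p 2222 root@localhost` should then reach the sandbox through the forwarded port.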
Created 03-02-2017 11:12 PM
@Edgar Orendain When I tried what was suggested in the link, this was the error message I got:
bind: Address already in use channel_setup_fwd_listener: cannot listen to port: 8080
bind: Address already in use channel_setup_fwd_listener: cannot listen to port: 8888
bind: Address already in use channel_setup_fwd_listener: cannot listen to port: 9995
bind: Address already in use channel_setup_fwd_listener: cannot listen to port: 9996
bind: Address already in use channel_setup_fwd_listener: cannot listen to port: 8886
bind: Address already in use channel_setup_fwd_listener: cannot listen to port: 10500
bind: Address already in use channel_setup_fwd_listener: cannot listen to port: 4200
bind: Address already in use channel_setup_fwd_listener: cannot listen to port: 2222
Could not request local forwarding.
Created 03-02-2017 11:25 PM
Do you already have an active connection to Azure open? It sounds like the ports are already bound, as they would be, from somewhere else. Check for active SSH connections, or sandboxes actively running on VirtualBox/Docker.
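If it helps with diagnosing, here's a hedged, bash-specific sketch for checking whether a port is already taken before re-running the forwarding command:

```shell
# Returns success (exit 0) if something is already listening on the port.
# Uses bash's /dev/tcp feature: the redirection only succeeds when a
# listener accepts the connection on 127.0.0.1.
port_in_use() {
  (exec 3<>"/dev/tcp/127.0.0.1/$1") 2>/dev/null
}

if port_in_use 8080; then
  echo "port 8080 is already bound"
else
  echo "port 8080 is free"
fi
```

On Linux, `ss -tlnp` (or `lsof -i :8080` on macOS/Linux) will additionally tell you which process holds the port, so you know what to close.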
Created 03-20-2017 04:55 AM
@Adedayo Adekeye I am facing the same issue ("Could not request local forwarding"). Did you get the issue resolved? If so, could you please help me out?
Created 03-03-2017 01:09 AM
@Edgar Orendain I just deployed HDP on Azure, and after the deployment I tried the port forwarding without first opening the ports in the Azure VM's security group (apart from the default SSH port), but this was the error I got:
"~/.ssh/config" E212: Can't open file for writing
Created 03-03-2017 01:27 AM
Have you tried writing to that file as superuser? E.g. "sudo vi ~/.ssh/config" ?
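A side note on that E212: Vim raises it when it can't write the file, most often because the containing directory doesn't exist. Since ~/.ssh belongs to your own user, sudo shouldn't normally be needed; a hedged sketch of the usual fix:

```shell
# Create the SSH config directory with the permissions OpenSSH expects,
# then create the config file, both as your normal (non-root) user.
mkdir -p ~/.ssh
chmod 700 ~/.ssh
touch ~/.ssh/config
chmod 600 ~/.ssh/config
# now "vi ~/.ssh/config" should save without E212
```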
Created 03-03-2017 01:35 AM
@Edgar Orendain Yes, the same error message.
Created 03-03-2017 01:46 AM
I'd make sure the file isn't currently open or in use anywhere. You mentioned before that you were able to get this file created and run the suggested SSH command, so something may have happened between then and now that's stopping you from making changes to it.
Created 03-03-2017 01:59 AM
@Edgar Orendain The VM was just created, and I tried what you suggested: if I add the ports to the inbound rules on my VM in Azure, it works. However, I'm having issues running command-line commands, because I am not able to sign in as the root user to work with HDFS.
Created 03-03-2017 02:14 AM
Not sure about the issue with HDFS, I've been focusing on getting you connected to the VM on Azure via SSH. In the topic, you mentioned having issues SSH'ing into localhost, but it sounds like you were able to open ports and SSH into the VM? That's a viable alternative to the linked tutorial, so that's great. If that's the case, are you able to navigate the sandbox freely and now just running into HDFS issues?
Unless you open up several other ports or adjust Azure's security groups, you may run into issues later with other closed ports when connecting from the outside (i.e. your local machine). The tutorial linked above deals with that specifically.
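If you do go the route of opening ports on the Azure side instead of tunneling everything, a hedged sketch with the Azure CLI (the resource group and VM names below are placeholders; `az vm open-port` adds an inbound network security group rule):

```shell
# Open one sandbox port in the VM's network security group.
# Repeat per port (e.g. 4200, 2222, ...); --priority must be unique per rule.
az vm open-port --resource-group myResourceGroup --name mySandboxVM \
  --port 8080 --priority 900
```

The same rules can be created by hand under the VM's "Networking" blade in the Azure portal, which is what adding to "inbound" above refers to.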
Created 05-08-2017 01:32 AM
The root password for the sandbox (Azure/VM) is hadoop. If it doesn't work over SSH, try opening the terminal in your browser (http://host:4200/):
enter username root and password hadoop.
It will then ask you to change the root password.
Created 10-21-2017 06:06 PM
I'm running into a similar issue...
I have a cluster on AWS Data Cloud. I'm able to SSH into the master node; however, it's specified that we need to SSH with the user "cloudbreak". After I'm in, when I do su, it prompts me for a password, but I never created a root password.
I need root to install pandas, and you can only do it with root...
Any ideas?
Created 01-08-2019 05:13 AM
The cloudbreak user should have sudo access. After logging in as cloudbreak, run 'sudo su'.
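Putting that together, a hedged sketch of the session on the master node (this assumes the cloudbreak user has passwordless sudo, as on standard Cloudbreak-built images; the pip command may be pip or pip3 depending on the image):

```
# as the cloudbreak user on the master node
sudo su -            # no root password needed if cloudbreak has passwordless sudo
pip install pandas   # now runs as root
```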
Created 02-20-2020 10:04 AM
username: root
Password: hadoop