Support Questions


HDFS password



I am trying to integrate a BI tool with Hortonworks, and it requires me to run a command: ./cluster dist

Running this command switches the user to hdfs and asks for a password. I have tried the root password that I changed, then admin, hadoop, and maria_dev, but none of them worked.

Any suggestions?



It doesn't look like you have this edge node defined as a client to Ambari. Is this correct? If so, you may want to consider adding the node as a client in Ambari so that the proper Hadoop users, client files, and configuration files needed to connect to the cluster via the command line are installed.

Otherwise, to get past this hurdle, you'll first need to set up a Linux user, hdfs, on the edge node where you are running this command:

sudo useradd -m -g hadoop hdfs
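Note that useradd will fail if the hadoop group does not exist yet, so a slightly more defensive sketch (assuming root privileges; group and user names as in the thread) is:

```shell
# Create the hadoop group and the hdfs user only if they are missing:
getent group hadoop >/dev/null 2>&1 || groupadd hadoop
id -u hdfs >/dev/null 2>&1 || useradd -m -g hadoop hdfs
# Verify the user exists and its primary group is hadoop:
id hdfs
```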


It's not a client to Ambari. How can I find the IP addresses of the slave nodes, and how can I bypass the password prompt?

Master Guru

Hi @suno bella, from root's command-line prompt, try switching to the hdfs user by running "su - hdfs"; no password is needed. Then try to run your "cluster" command. As for the slave's address, use the output of "hostname -f" (run as root).


Hi @Predrag Minovic, I tried running hostname -f and it gave the slave's address. I used that address as root/hdfs, but it still prompts me for a password, and it accepts none of the passwords I can think of. The prompt looks like: root@slaves_address password: Can I reset or bypass this password?

If I run ssh slaves_address as root, it does not ask for a password; it only asks while running that command from the bin folder.
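A quick way to tell whether passwordless SSH is actually configured for a given user/host pair is to force non-interactive mode; the hostname below is a placeholder, not from the thread:

```shell
# BatchMode=yes makes ssh fail instead of prompting for a password,
# so this distinguishes "key-based login works" from "password required":
if ssh -o BatchMode=yes -o ConnectTimeout=5 root@slave1.example.com true 2>/dev/null; then
    echo "passwordless SSH is set up"
else
    echo "password (or key setup) still required"
fi
```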

@suno bella: Passwordless SSH with the slave nodes is typically set up by the sysadmin for root only, during the Preparing the Environment step of the Ambari install. If you wanted to do this for a user other than root, you'd need to follow the same instructions as in the link above, but while logged in (or su'd) as that user.
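That procedure boils down to generating a key pair and installing the public key on each slave node. A minimal sketch, assuming the key filename and slave hostname below are placeholders you would adapt to your cluster:

```shell
# Run as the user who needs passwordless access (e.g. after "su - hdfs").
KEY="$HOME/.ssh/id_rsa_cluster"          # hypothetical key name for this example
mkdir -p "$HOME/.ssh" && chmod 700 "$HOME/.ssh"
# Generate a key pair with an empty passphrase if one does not exist yet:
[ -f "$KEY" ] || ssh-keygen -t rsa -N "" -f "$KEY" -q
# Install the public key on the slave (prompts for that user's password one last time):
#   ssh-copy-id -i "$KEY.pub" hdfs@<slave_address>
# After that, "ssh -i $KEY hdfs@<slave_address>" should not ask for a password.
```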


Hi @Tom McCuch,

I tried these steps but it does not help. I am new to the Hadoop environment, which is why there is so much confusion. What I am trying to do is:

I have downloaded a Hortonworks sandbox and changed its IP address to be in the same network as my BI tool. Now I am trying to establish a connection between the BI tool and Hadoop by following:

I am not sure about the slave and master nodes; all I did was copy this ( ) folder to the edge (master node) directory

and in the slaves file (step 13) added the IP address of this VM (which I changed).

Now when I follow step 17 it asks me for a password, and I am not sure which password; it's not the hadoop password that I changed, and not the default.

Any help would be great.




@suno bella

Please explain how the BI tool connects to Hadoop. If it connects as root and then does su - hdfs, then it shouldn't ask for a password, since root is a superuser.

If you know the root password, then you could use a command like the one below.

As the root user:

su - hdfs -c "/path/to/cluster/command dist"

While logged in as root, this runs the command as the "hdfs" user.

Hope this helps.

New Contributor

@suno bella I had the same issue. The problem was that I had added the mount command for my shared folder in /etc/profile.d/. After I moved it to /etc/init.d/, the issue was resolved. I think the problem was that, as I logged in as hdfs, it was running the mount command again.