I am trying to integrate a BI tool with Hortonworks, and it requires me to run the command: ./cluster dist
Running this command switches the user to hdfs and asks for a password. I have tried the root password that I changed, then admin, hadoop, and maria_dev, but none of them worked.
It doesn't look like you have this edge node defined as a client to Ambari. Is that correct? If so, you may want to consider adding the node as a client in Ambari so that the proper Hadoop users, client files, and configuration files needed to connect to the cluster from the command line are installed.
Otherwise, to get past this hurdle, you'll first need to create a Linux user, hdfs, on the edge node where you are running this command:
sudo useradd -m -g hadoop hdfs
Hi @suno bella, from root's command-line prompt, try switching to the hdfs user by running "su - hdfs"; no password is needed. Then try running your "cluster" command. For the slave's address, use the output of "hostname -f" (run as root).
Hi @Predrag Minovic, I tried running hostname -f and it gave the slave's address. I was using that same address as root/hdfs, but it still prompts me for a password to connect, and it accepts none of the passwords I can think of. It prompts me like: root@slaves_address password: Can I reset or bypass this password?
If I do ssh slaves_address from root, it does not ask for a password; it only prompts while running that command from the bin folder.
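One likely cause: ssh works passwordlessly as root because root's key is authorized on the slave, but the cluster script switches to hdfs first, and hdfs has no key of its own. A minimal sketch of fixing that, assuming the hdfs user exists on the edge node and slaves_address stands in for the FQDN reported by hostname -f:

```shell
# Run as root on the edge node. Gives hdfs its own SSH key and authorizes
# it on the slave, so "ssh root@slaves_address" run as hdfs stops prompting.
su - hdfs -c 'test -f ~/.ssh/id_rsa || ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa'
su - hdfs -c 'ssh-copy-id root@slaves_address'  # asks for the slave's root password once
su - hdfs -c 'ssh root@slaves_address true'     # should now connect without a prompt
```

ssh-copy-id will still need the slave's root password that one time; if you don't know it, you'd have to reset it on the slave first.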
Hi @Tom McCuch,
I tried these steps but they do not help. I am new to the Hadoop environment, hence so much confusion. What I am trying to do is:
I downloaded a Hortonworks sandbox and changed its IP address to be on the same network as my BI tool. Then I tried to follow the steps given to establish a connection between the BI tool and Hadoop:
I am not sure about the slave and master nodes; all I did was copy this folder (bde-1.0-22.214.171.124.1.1.0-385-SNAPSHOT-dist.tar.gz) to the edge (master node) directory,
and in the slaves file (step 13) I added the IP address of this VM (which I changed).
Now when I follow step 17, it asks me for a password, and I am not sure which password; it's not the Hadoop password that I changed, nor the default.
Any help would be great.
Please explain how the BI tool connects to Hadoop. If it connects as root and then does su - hdfs, it shouldn't ask for a password, since root is a superuser.
If you know the root password, then, as the root user, you could use a command like the one below:
su - hdfs -c "/path/to/cluster/command dist"
This logs in as the root user and runs the command as "hdfs".
Hope this helps.
@suno bella I had the same issue. The problem was that I had added my mount command for a shared folder in /etc/profile.d/mount.sh; after I moved it to /etc/init.d/, the issue was resolved. I think the problem was that when I logged in as hdfs, it was calling mount.sh again.
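For reference, the workaround above amounts to moving the mount step out of the per-login profile scripts into a boot-time script. A sketch, with a hypothetical VirtualBox shared-folder mount standing in for whatever your actual mount.sh contains:

```shell
# Scripts in /etc/profile.d/ run on EVERY login shell, including the
# "su - hdfs" the cluster script performs, so a mount command there
# fires again on each switch. Moving it to an init script runs it once at boot:
mv /etc/profile.d/mount.sh /etc/init.d/mount.sh
chmod +x /etc/init.d/mount.sh
# Example contents of mount.sh (hypothetical shared-folder mount):
#   mount -t vboxsf shared /mnt/shared
```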