I want to start a coordinator job, but I received the following error:
Error: E0501 : E0501: Could not perform authorization operation, Unauthorized connection for super-user: oozie from IP X.X.X.X
I then added the following properties to core-site.xml and restarted the cluster, but that didn't help:
<property>
  <name>hadoop.proxyuser.oozie.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.oozie.groups</name>
  <value>*</value>
</property>
How can I solve this?
Also make sure that every directory on the absolute path containing workflow.xml has at least 755 permissions, so that the user can traverse the path and read the file.
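For example, the check above can be sketched as follows, using /apps/data-mirroring (the path mentioned elsewhere in this thread) as a placeholder; substitute your own workflow directory. The guard lets the sketch degrade gracefully off-cluster:

```shell
# Sketch: give every directory on the path to workflow.xml at least 755
# (rwxr-xr-x) so the submitting user can traverse and read it.
# /apps/data-mirroring is an example path; substitute your own.
WF_DIR=/apps/data-mirroring
if command -v hdfs >/dev/null 2>&1; then
  hdfs dfs -chmod -R 755 "$WF_DIR"
  # Verify: the directory should now list as drwxr-xr-x
  hdfs dfs -ls -d "$WF_DIR"
else
  echo "hdfs client not found; run on a cluster node (would chmod -R 755 $WF_DIR)"
fi
```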
@bsaini : I am trying out the tutorial mirroring-datasets-between-hadoop-clusters-with-apache-falcon. You said the path to workflow.xml should have at least 755, so I made these changes in two locations: 1) /apps/data-mirroring/*.xml 2) /apps/falcon/backupCluster/staging/falcon/workflows/process/MirrorTest/cf29a6898f4d78c4515a7d0b22f51b6e_1454333601227/DEFAULT/. Now when I run the Mirror Test, I get the following exception: Caused by: org.apache.hadoop.security.AccessControlException: Permission denied. user=ambari-qa is not the owner of inode=MirrorTest. What is going wrong?
@bsaini @Artem Ervits : After making the changes in core-site.xml, whenever I follow the steps I see this exception: Unauthorized connection for super-user: oozie from IP X.X.X.X. When I change the ownership to any other user, I instead get org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied. user=ambari-qa is not the owner of inode=MirrorTest. If I then change the ownership back to "ambari-qa", I again get Unauthorized connection for super-user: oozie from IP X.X.X.X. Is there anything, or a location, that I am missing?
After installing HDP 2.4, I had an issue with a similar error:
Unauthorized connection for super-user: oozie from IP..
As suggested above, I set hadoop.proxyuser.oozie.hosts and hadoop.proxyuser.oozie.groups to *.
Additionally, I had to copy the Oozie shared library to HDFS:
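As a side note, after editing the proxyuser properties in core-site.xml they can be reloaded without a full cluster restart; both the NameNode and the ResourceManager cache these values. A sketch, assuming the hdfs and yarn clients are on the PATH of a cluster node:

```shell
# Sketch: reload hadoop.proxyuser.* settings on the running daemons
# instead of restarting the whole cluster.
if command -v hdfs >/dev/null 2>&1; then
  hdfs dfsadmin -refreshSuperUserGroupsConfiguration
  yarn rmadmin -refreshSuperUserGroupsConfiguration
else
  echo "hdfs/yarn clients not found; run on a cluster node"
fi
STATUS=done
```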
tar xvf <HDP_install_dir>/oozie/oozie-sharelib.tar.gz
sudo -u oozie hadoop fs -put share /user/oozie/share
Now you can start Oozie from Ambari.
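To confirm that Oozie actually sees the sharelib copied above, the admin CLI can list it. A sketch; oozie-host is a placeholder for your Oozie server, and 11000 is the default Oozie port:

```shell
# Sketch: ask the Oozie server which sharelibs it can resolve in HDFS.
# oozie-host is a placeholder; 11000 is the default Oozie HTTP port.
OOZIE_URL="http://oozie-host:11000/oozie"
if command -v oozie >/dev/null 2>&1; then
  oozie admin -oozie "$OOZIE_URL" -shareliblist
else
  echo "oozie client not found; would query $OOZIE_URL"
fi
```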
Another troubleshooting check is to verify that the database is configured correctly, with the DB name, user, and password matching what is specified in /etc/oozie/conf/oozie-site.xml:
sudo -u oozie /usr/lib/oozie/bin/ooziedb.sh create -run
If everything is configured correctly, the command reports that the DB has been set up.
In my installation I also had to grant DB privileges to the oozie user.
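For illustration, the grant step can be sketched as below. This is a hypothetical example assuming MySQL as the backing store; the database name, user, and password in oozie-site.xml on your cluster may differ, so the sketch only prints the statement to run rather than executing it:

```shell
# Hypothetical sketch (MySQL assumed): the DB and user names are
# placeholders matching the defaults, not values from this thread.
DB=oozie
DB_USER=oozie
SQL="GRANT ALL PRIVILEGES ON ${DB}.* TO '${DB_USER}'@'%'; FLUSH PRIVILEGES;"
echo "Run in the MySQL shell as root: $SQL"
```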