
Oozie Sqoop job error

Contributor
[root@hdp-m2 testData]# oozie job -auth SIMPLE -oozie http://hdp-m2:11000/oozie -config job.properties -run
Error: E0501 : E0501: Could not perform authorization operation, User: root is not allowed to impersonate root


Can somebody help me?

1 ACCEPTED SOLUTION


@da li

The answers here are close, but not quite. The proxy user settings take the form hadoop.proxyuser.<username>.[groups|hosts]. So, in the Custom core-site section of Ambari (the proxyuser properties belong in core-site.xml, not hdfs-site.xml), add the following two parameters:

hadoop.proxyuser.root.hosts=*
hadoop.proxyuser.root.groups=*

This will correct the impersonation error.
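As a quick sanity check (assuming the root proxy user settings from this answer), once Ambari has pushed the new configuration out you can print the effective values with the standard getconf tool:

# Print the effective proxyuser settings from the deployed core-site.xml
hdfs getconf -confKey hadoop.proxyuser.root.hosts
hdfs getconf -confKey hadoop.proxyuser.root.groups

Both commands should print * once the change has been applied.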


7 REPLIES

@da li

This issue is observed when the proxy user is set incorrectly for Oozie. Set the properties below in Ambari under the HDFS service > Configs tab (Custom core-site), then restart the affected components.

hadoop.proxyuser.oozie.hosts = *

hadoop.proxyuser.oozie.groups = *
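If a full restart is inconvenient, the NameNode and ResourceManager can also reload the hadoop.proxyuser.* settings in place with Hadoop's standard admin commands (run them as the HDFS/YARN superuser):

# Reload proxyuser settings without restarting the daemons
hdfs dfsadmin -refreshSuperUserGroupsConfiguration
yarn rmadmin -refreshSuperUserGroupsConfiguration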

Contributor

I have verified the settings in hdfs-site.xml as above

and restarted my HDP cluster from Ambari, then ran:

[root@hdp-m2 testData]# oozie job -oozie http://hdp-m2:11000/oozie -config job.properties -run
Error: E0501 : E0501: Could not perform authorization operation, User: root is not allowed to impersonate root

but it doesn't work.

Why? Do you have some other solution? Thank you very much.
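One thing worth checking at this point: Hadoop reads the hadoop.proxyuser.* properties from core-site.xml on the NameNode host, not from hdfs-site.xml, so it is worth confirming which file the entries actually landed in (the paths below assume a standard HDP layout):

# Check where the proxyuser entries ended up on the NameNode host
grep -A 2 'hadoop.proxyuser.root' /etc/hadoop/conf/hdfs-site.xml
grep -A 2 'hadoop.proxyuser.root' /etc/hadoop/conf/core-site.xml

If they only show up in hdfs-site.xml, move them to Custom core-site in Ambari and restart HDFS again.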

Expert Contributor

@da li

Hey, please have a look at the link below:

https://community.hortonworks.com/questions/153/impersonation-error-while-trying-to-access-ambari.ht...

If it helps, please accept the answer.

You need to create the proxy settings for 'root', since Ambari runs as root. This allows it to impersonate the user in HDFS.

You need to do the same thing for the oozie user as is done for root:

hadoop.proxyuser.root.groups=*

hadoop.proxyuser.root.hosts=*
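A quick way to test whether impersonation itself is allowed, independent of Oozie, is a WebHDFS call with the doas parameter (this assumes SIMPLE authentication, as used in the oozie command above, and the default NameNode HTTP port 50070):

# Ask the NameNode to list /user as root, proxied through the oozie user
curl "http://hdp-m2:50070/webhdfs/v1/user?op=LISTSTATUS&user.name=oozie&doas=root"

If the proxyuser rules are wrong, this returns the same 'is not allowed to impersonate' message in the JSON response.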

Contributor

I want to load my data from MySQL into HDFS. Here are my files.

My workflow.xml:

<workflow-app xmlns="uri:oozie:workflow:0.2" name="sqoop-wf">
    <start to="sqoop-node"/>


    <action name="sqoop-node">
        <sqoop xmlns="uri:oozie:sqoop-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <prepare>
                <delete path="${nameNode}/user/${wf:user()}/${examplesRoot}/output-data/sqoop"/>
                <mkdir path="${nameNode}/user/${wf:user()}/${examplesRoot}/output-data"/>
            </prepare>
            <configuration>
                <property>
                    <name>mapred.job.queue.name</name>
                    <value>${queueName}</value>
                </property>
            </configuration>
            <!-- <command>import --connect jdbc:mysql://XXX:3306/ph51_dcp --table ph51dcp_visit_log  --username root --password lida123321 --target-dir /user/${wf:user()}/${examplesRoot}/output-data/sqoop -m 1</command> -->
            <command>import --connect jdbc:mysql://XXX:3306/ph51_dcp --table ph51dcp_visit_log  --username root --password lida123321 --target-dir /user/sqoop-1 -m 1</command>
            <file>db.hsqldb.properties#db.hsqldb.properties</file>
            <file>db.hsqldb.script#db.hsqldb.script</file>
        </sqoop>
        <ok to="end"/>
        <error to="fail"/>
    </action>


    <kill name="fail">
        <message>Sqoop failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end"/>
</workflow-app>
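Before debugging this inside Oozie, it may help to confirm that the Sqoop command itself works when run directly on a cluster node (same connect string and table as in the workflow; -P prompts for the password instead of putting it on the command line):

# Run the same import outside of Oozie as a sanity check
sqoop import --connect jdbc:mysql://XXX:3306/ph51_dcp --table ph51dcp_visit_log \
  --username root -P --target-dir /user/sqoop-1 -m 1

If this works but the Oozie action still fails, the problem is in the Oozie/impersonation layer rather than in Sqoop itself.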


My job.properties:



nameNode=hdfs://hdp-m1:8020
jobTracker=hdp-m1:8021
queueName=default
examplesRoot=examples


oozie.use.system.libpath=true


oozie.wf.application.path=${nameNode}/user/${user.name}/${examplesRoot}/apps/sqoop



Master Guru

@da li - I believe you have gotten past the impersonation issue after referring to the answers here. Are you still facing the same issue? If not, I would suggest you accept the appropriate answer and start a new question if you have any further issues.

Contributor

Thanks everybody.