oozie sqoop job error
Labels: Apache Sqoop
Created ‎08-24-2016 09:11 AM
[root@hdp-m2 testData]# oozie job -auth SIMPLE -oozie http://hdp-m2:11000/oozie -config job.properties -run
Error: E0501 : E0501: Could not perform authorization operation, User: root is not allowed to impersonate root
Can somebody help me?
Created ‎08-24-2016 02:26 PM
The answers here are close, but not quite. The proxy user settings take the form of hadoop.proxyuser.<username>.[groups|hosts]. So, in your Custom hdfs-site.xml section of Ambari, add the following two parameters:
hadoop.proxyuser.root.hosts=*
hadoop.proxyuser.root.groups=*
This will correct the impersonation error.
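For reference, in the generated site XML the two parameters would look roughly like this (a sketch only, assuming Ambari writes custom properties out as standard <property> elements; in stock Hadoop the proxyuser entries usually live in core-site.xml):

<!-- allow the root user to impersonate other users from any host -->
<property>
  <name>hadoop.proxyuser.root.hosts</name>
  <value>*</value>
</property>
<!-- allow the root user to impersonate members of any group -->
<property>
  <name>hadoop.proxyuser.root.groups</name>
  <value>*</value>
</property>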
Created ‎08-24-2016 09:18 AM
This issue is observed when the proxyuser is set incorrectly for Oozie. Set the properties below under the Ambari HDFS component (Configs tab) and restart the affected components.
hadoop.proxyuser.oozie.hosts = *
hadoop.proxyuser.oozie.groups = *
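Once saved, these render as ordinary Hadoop <property> elements in the managed site configuration (sketch only, with the values from this answer):

<!-- allow the oozie service user to impersonate job submitters from any host / group -->
<property>
  <name>hadoop.proxyuser.oozie.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.oozie.groups</name>
  <value>*</value>
</property>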
Created ‎08-24-2016 09:50 AM
I have verified the hdfs-site.xml like this and restarted my HDP via Ambari, then I ran:
[root@hdp-m2 testData]# oozie job -oozie http://hdp-m2:11000/oozie -config job.properties -run
Error: E0501 : E0501: Could not perform authorization operation, User: root is not allowed to impersonate root
but it still doesn't work. Why? Do you have some other solution? Thank you very much.
Created ‎08-24-2016 09:20 AM
Hey, please have a look at the below; if it helps, accept the answer.
You need to create the proxy settings for 'root', since Ambari runs as root. This allows it to impersonate the user in HDFS. You need to do the same for the oozie user, as is done for root:
hadoop.proxyuser.root.groups=*
hadoop.proxyuser.root.hosts=*
Created ‎08-24-2016 09:55 AM
I want to load my data from MySQL into HDFS. Here are my files.
my workflow.xml:
<workflow-app xmlns="uri:oozie:workflow:0.2" name="sqoop-wf">
    <start to="sqoop-node"/>
    <action name="sqoop-node">
        <sqoop xmlns="uri:oozie:sqoop-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <prepare>
                <delete path="${nameNode}/user/${wf:user()}/${examplesRoot}/output-data/sqoop"/>
                <mkdir path="${nameNode}/user/${wf:user()}/${examplesRoot}/output-data"/>
            </prepare>
            <configuration>
                <property>
                    <name>mapred.job.queue.name</name>
                    <value>${queueName}</value>
                </property>
            </configuration>
            <!-- <command>import --connect jdbc:mysql://XXX:3306/ph51_dcp --table ph51dcp_visit_log --username root --password lida123321 --target-dir /user/${wf:user()}/${examplesRoot}/output-data/sqoop -m 1</command> -->
            <command>import --connect jdbc:mysql://XXX:3306/ph51_dcp --table ph51dcp_visit_log --username root --password lida123321 --target-dir /user/sqoop-1 -m 1</command>
            <file>db.hsqldb.properties#db.hsqldb.properties</file>
            <file>db.hsqldb.script#db.hsqldb.script</file>
        </sqoop>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>Sqoop failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end"/>
</workflow-app>
my job.properties:
# limitations under the License.
nameNode=hdfs://hdp-m1:8020
jobTracker=hdp-m1:8021
queueName=default
examplesRoot=examples
oozie.use.system.libpath=true
oozie.wf.application.path=${nameNode}/user/${user.name}/${examplesRoot}/apps/sqoop
Created ‎08-24-2016 06:17 PM
@da li - I believe you have got past the impersonation issue after following the answers below. Are you still facing the same issue? If not, I would suggest you accept the appropriate answer and start a new question for any further issues.
Created ‎08-25-2016 01:22 AM
Thanks everybody.
