Explorer
Posts: 32
Registered: 11-24-2015

Yarn security issues

I am trying to set up Kerberos with YARN but am running into issues when starting the NodeManager. I would appreciate it if somebody could help sort this out.

 

The user:group is hitex:hitex for HDFS and yarn:hitex for YARN.

 

Below are the relevant configurations and the error log:

-rwxr-xr-x 1 root hitex 159223 Sep 17 02:48 container-executor

-rwxrwx--- 1 yarn hitex 286 Nov 27 11:36 yarn.keytab
-rw-r--r-- 1 hitex hitex 3068 Dec 1 14:40 yarn-site.xml
-r-------- 1 root hitex 249 Dec 1 14:46 container-executor.cfg


drwxr-xr-x 2 yarn hitex 4096 Nov 27 14:23 nodemanager_dir
drwxr-xr-x 2 yarn hitex 4096 Dec 1 14:20 nodemanager_log_dir


yarn@hadoop2:/home/hitex/Desktop/Hadoop/hadoop-2.6.1/etc/hadoop$ cat container-executor.cfg
yarn.nodemanager.local-dirs=/home/hitex/Desktop/Hadoop/nodedata/nodemanager_dir
yarn.nodemanager.log-dirs=/tmp/nodemanager_log
yarn.nodemanager.linux-container-executor.group=yarn
banned.users=hitex,yarn
min.user.id=1000
allowed.system.users=##comma separated list of system users who CAN run applications

 

Error log:

2015-12-01 15:35:54,079 INFO org.apache.hadoop.yarn.server.nodemanager.NodeManager: registered UNIX signal handlers for [TERM, HUP, INT]
2015-12-01 15:36:01,333 WARN org.apache.hadoop.yarn.server.nodemanager.LinuxContainerExecutor: Exit code from container executor initialization is : 24

ExitCodeException exitCode=24: File /home/hitex/Desktop/Hadoop/hadoop-2.6.1/etc/hadoop must be owned by root, but is owned by 1000

at org.apache.hadoop.util.Shell.runCommand(Shell.java:538)
at org.apache.hadoop.util.Shell.run(Shell.java:455)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)
at org.apache.hadoop.yarn.server.nodemanager.LinuxContainerExecutor.init(LinuxContainerExecutor.java:181)
at org.apache.hadoop.yarn.server.nodemanager.NodeManager.serviceInit(NodeManager.java:211)
at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
at org.apache.hadoop.yarn.server.nodemanager.NodeManager.initAndStartNodeManager(NodeManager.java:480)
at org.apache.hadoop.yarn.server.nodemanager.NodeManager.main(NodeManager.java:527)
2015-12-01 15:36:01,347 INFO org.apache.hadoop.yarn.server.nodemanager.ContainerExecutor:
2015-12-01 15:36:01,347 INFO org.apache.hadoop.service.AbstractService: Service NodeManager failed in state INITED; cause: org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Failed to initialize container executor
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Failed to initialize container executor
at org.apache.hadoop.yarn.server.nodemanager.NodeManager.serviceInit(NodeManager.java:213)
at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
at org.apache.hadoop.yarn.server.nodemanager.NodeManager.initAndStartNodeManager(NodeManager.java:480)
at org.apache.hadoop.yarn.server.nodemanager.NodeManager.main(NodeManager.java:527)
Caused by: java.io.IOException: Linux container executor not configured properly (error=24)
at org.apache.hadoop.yarn.server.nodemanager.LinuxContainerExecutor.init(LinuxContainerExecutor.java:187)
at org.apache.hadoop.yarn.server.nodemanager.NodeManager.serviceInit(NodeManager.java:211)
... 3 more
Caused by: ExitCodeException exitCode=24: File /home/hitex/Desktop/Hadoop/hadoop-2.6.1/etc/hadoop must be owned by root, but is owned by 1000

at org.apache.hadoop.util.Shell.runCommand(Shell.java:538)
at org.apache.hadoop.util.Shell.run(Shell.java:455)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)
at org.apache.hadoop.yarn.server.nodemanager.LinuxContainerExecutor.init(LinuxContainerExecutor.java:181)
... 4 more
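
For reference, the exit code 24 check expects container-executor.cfg and each directory on the path leading to it to be owned by root (as the error message indicates), so the ownership along that path can be inspected with, for example:

namei -l /home/hitex/Desktop/Hadoop/hadoop-2.6.1/etc/hadoop/container-executor.cfg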

Posts: 1,730
Kudos: 357
Solutions: 274
Registered: 07-31-2013

Re: Yarn security issues

What user do you start the NodeManager as? The user hitex, or the user root (via sudo)?

 

This is what a regular setup looks like on a Cloudera Manager-installed, parcel-based CDH (which I highly recommend you use, because it takes away all of this misconfiguration pain):

 

The container-executor binary:

 

~> namei -l /opt/cloudera/parcels/CDH/lib/hadoop-yarn/bin/container-executor
f: /opt/cloudera/parcels/CDH/lib/hadoop-yarn/bin/container-executor
dr-xr-xr-x root         root         /
drwxr-xr-x root         root         opt
drwxr-xr-x cloudera-scm cloudera-scm cloudera
drwxr-xr-x root         root         parcels
lrwxrwxrwx root         root         CDH -> CDH-5.5.0-1.cdh5.5.0.p0.8
drwxr-xr-x root         root           CDH-5.5.0-1.cdh5.5.0.p0.8
drwxr-xr-x root         root         lib
drwxr-xr-x root         root         hadoop-yarn
drwxr-xr-x root         root         bin
---Sr-s--- root         yarn         container-executor

The container-executor config:

 

~> namei -l /etc/hadoop/conf/container-executor.cfg
f: /etc/hadoop/conf/container-executor.cfg
dr-xr-xr-x root root   /
drwxr-xr-x root root   etc
drwxr-xr-x root root   hadoop
drwxr-xr-x root root   conf
-r--r--r-- root hadoop container-executor.cfg

In CM's regular setup, the parcel is entirely owned by "root", but the NodeManager itself is executed as "yarn".
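
Translated to a plain Apache Hadoop layout, the important bits are: the binary is setuid/setgid root with its group matching yarn.nodemanager.linux-container-executor.group, the cfg is owned by root and not writable by anyone else, and every directory from / down to the cfg is owned by root (which is why a path under a user's home directory fails the check). A rough sketch only, with an assumed install prefix you would need to adjust to your own layout:

# assumed prefix for illustration only; adjust to your actual install path
HADOOP_HOME=/usr/local/hadoop-2.6.1

# setuid/setgid root binary; group must match yarn.nodemanager.linux-container-executor.group
sudo chown root:yarn $HADOOP_HOME/bin/container-executor
sudo chmod 6050      $HADOOP_HOME/bin/container-executor

# root-owned config, readable but not writable by others (group here mirrors the CM layout above)
sudo chown root:hadoop $HADOOP_HOME/etc/hadoop/container-executor.cfg
sudo chmod 0444        $HADOOP_HOME/etc/hadoop/container-executor.cfg

# verify that every path component down to the cfg is owned by root
namei -l $HADOOP_HOME/etc/hadoop/container-executor.cfg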

 

Explorer
Posts: 12
Registered: 09-25-2014

Re: Yarn security issues

I am facing a similar issue. What is the best way to fix it?

 

I added a node back to the cluster after a long while and hit the same error. All the other nodes seem to be working fine.

Explorer
Posts: 32
Registered: 11-24-2015

Re: Yarn security issues


It has been a while, but I think the node should be started as root due to reserved port usage. You might want to check how the other nodes in your cluster were started, i.e. as which user (a quick check is sketched below). Please note that I was using Apache Hadoop, not Cloudera.
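
For example, on one of the working nodes something along these lines shows which user the NodeManager daemon runs as:

ps -ef | grep '[N]odeManager'   # the first column is the user running the NodeManager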

Explorer
Posts: 12
Registered: 09-25-2014

Re: Yarn security issues


I had to delete the NodeManager role from the host, add it back, and run "Deploy Client Configuration" from CM. It worked after that.

New Contributor
Posts: 1
Registered: 09-13-2018

Re: Yarn security issues

This is the perfect solution for this problem. I tried it and it works fine: I deleted the NodeManager role, redeployed the YARN client configuration for the whole YARN service, and then restarted the NodeManager, and it is now working fine.
