<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Permission Error while running spark-shell in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Permission-Error-while-running-spark-shell/m-p/49447#M51593</link>
    <description>&lt;P&gt;No, you definitely do not want to take this dir away from hdfs! In general I'd never mess with the HDFS permissions for key dirs like this. Instead, the hdfs user needs to create a directory for your user. This kind of thing happens automatically via Hue.&lt;/P&gt;</description>
    <pubDate>Sun, 15 Jan 2017 18:13:25 GMT</pubDate>
    <dc:creator>srowen</dc:creator>
    <dc:date>2017-01-15T18:13:25Z</dc:date>
    <item>
      <title>Permission Error while running spark-shell</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Permission-Error-while-running-spark-shell/m-p/49439#M51590</link>
      <description>&lt;P&gt;Hello All,&lt;/P&gt;&lt;P&gt;While running spark-shell I am getting the permission error below. Can anybody help me out with this? I have just installed Cloudera Manager with core Hadoop + Spark on CentOS 6.2 with 20GB RAM, using CDH 5.8 and Hadoop 2.6.&lt;/P&gt;&lt;P&gt;To adjust logging level use sc.setLogLevel(newLevel).&lt;BR /&gt;Welcome to Spark version 1.6.0&lt;/P&gt;&lt;P&gt;Using Scala version 2.10.5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_67)&lt;BR /&gt;Type in expressions to have them evaluated.&lt;BR /&gt;Type :help for more information.&lt;BR /&gt;17/01/15 09:45:53 ERROR spark.SparkContext: Error initializing SparkContext.&lt;BR /&gt;org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="/user":hdfs:supergroup:drwxr-xr-x&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:281)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:262)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:242)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:169)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6621)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6603)&lt;BR /&gt;at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:6555)&lt;/P&gt;</description>
      <pubDate>Fri, 16 Sep 2022 10:54:47 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Permission-Error-while-running-spark-shell/m-p/49439#M51590</guid>
      <dc:creator>justin3113</dc:creator>
      <dc:date>2022-09-16T10:54:47Z</dc:date>
    </item>
    <item>
      <title>Re: Permission Error while running spark-shell</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Permission-Error-while-running-spark-shell/m-p/49444#M51591</link>
      <description>&lt;P&gt;That's the general error you get when you run as user foo but haven't set up /user/foo in HDFS. The usual way that is done is through Hue, or by syncing with something like Active Directory.&lt;/P&gt;</description>
      <pubDate>Sun, 15 Jan 2017 10:34:39 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Permission-Error-while-running-spark-shell/m-p/49444#M51591</guid>
      <dc:creator>srowen</dc:creator>
      <dc:date>2017-01-15T10:34:39Z</dc:date>
    </item>
    <item>
      <title>Re: Permission Error while running spark-shell</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Permission-Error-while-running-spark-shell/m-p/49445#M51592</link>
      <description>&lt;P&gt;Do you mean /usr/root?&lt;/P&gt;&lt;P&gt;I was able to overcome the issue with the commands below.&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;su - hdfs&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;hdfs dfs -chown -R root:hdfs /user&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;exit&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Sun, 15 Jan 2017 16:42:37 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Permission-Error-while-running-spark-shell/m-p/49445#M51592</guid>
      <dc:creator>justin3113</dc:creator>
      <dc:date>2017-01-15T16:42:37Z</dc:date>
    </item>
    <item>
      <title>Re: Permission Error while running spark-shell</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Permission-Error-while-running-spark-shell/m-p/49447#M51593</link>
      <description>&lt;P&gt;No, you definitely do not want to take this dir away from hdfs! In general I'd never mess with the HDFS permissions for key dirs like this. Instead, the hdfs user needs to create a directory for your user. This kind of thing happens automatically via Hue.&lt;/P&gt;</description>
      <pubDate>Sun, 15 Jan 2017 18:13:25 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Permission-Error-while-running-spark-shell/m-p/49447#M51593</guid>
      <dc:creator>srowen</dc:creator>
      <dc:date>2017-01-15T18:13:25Z</dc:date>
    </item>
    <item>
      <title>Re: Permission Error while running spark-shell</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Permission-Error-while-running-spark-shell/m-p/49490#M51594</link>
      <description>&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/19677"&gt;@justin3113&lt;/a&gt; To run jobs across all nodes, a user must exist on each node (justin3113, for example), and each user needs a home directory under /user in HDFS with read and write access. This is so the job can write temporary data to HDFS from whatever node it is running on. The error says the job is trying to create that user directory, but only the hdfs user has permission to write under /user. Opening up access gets around it, but that is not advisable. Instead, for each user run:&lt;BR /&gt;&lt;BR /&gt;su - hdfs&lt;BR /&gt;hdfs dfs -mkdir /user/justin3113</description>
      <pubDate>Tue, 17 Jan 2017 05:06:25 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Permission-Error-while-running-spark-shell/m-p/49490#M51594</guid>
      <dc:creator>mbigelow</dc:creator>
      <dc:date>2017-01-17T05:06:25Z</dc:date>
    </item>
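The fix recommended in this thread can be sketched as the following shell session. This is a sketch based on the posts above, not output from the thread itself: the username justin3113 comes from the original poster, the restore of /user's ownership assumes it was chown'd to root:hdfs as described earlier, and the group name justin3113 in the final chown is an assumption; adjust both to your cluster.

```shell
# Undo the earlier workaround: give /user back to the HDFS superuser.
# (Assumes /user was previously chown'd to root:hdfs as in the thread;
# hdfs:supergroup matches the ownership shown in the original error.)
sudo -u hdfs hdfs dfs -chown hdfs:supergroup /user

# Create a per-user home directory in HDFS and hand ownership to that
# user, so spark-shell can write its temporary data there.
sudo -u hdfs hdfs dfs -mkdir -p /user/justin3113
sudo -u hdfs hdfs dfs -chown justin3113:justin3113 /user/justin3113
```

After this, spark-shell run as justin3113 (rather than root) should initialize its SparkContext without the AccessControlException, since the job writes under /user/justin3113 instead of trying to create it.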
  </channel>
</rss>

