Permission error when working with Spark

Hello everyone, I get a strange permission error when executing the following code.

Can anyone help me with this?

from pyspark.sql.types import (StructType, StructField, IntegerType, LongType,
                               FloatType, DoubleType, ArrayType, StringType)

fields = [StructField("id", IntegerType()), StructField("IntegerCol", IntegerType()),
          StructField("LongCol", LongType()), StructField("FloatCol", FloatType()),
          StructField("DoubleCol", DoubleType()), StructField("VectorCol", ArrayType(DoubleType(), True)),
          StructField("StringCol", StringType())]
schema = StructType(fields)
test_rows = [[11, 1, 23, 10.0, 14.0, [1.0, 2.0], "r1"], [21, 2, 24, 12.0, 15.0, [2.0, 2.0], "r2"]]
rdd = spark.sparkContext.parallelize(test_rows)
df = spark.createDataFrame(rdd, schema)

pyspark.sql.utils.AnalysisException: 'java.lang.RuntimeException: org.apache.hadoop.security.AccessControlException: Permission denied: user=MYUSERNAME, access=EXECUTE, inode="/tmp":hdfs:supergroup:drwxrwxrwt
    at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkAccessAcl(DefaultAuthorizationProvider.java:363)
    at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:256)
    at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkTraverse(DefaultAuthorizationProvider.java:201)
    at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:154)
    at org.apache.sentry.hdfs.SentryAuthorizationProvider.checkPermission(SentryAuthorizationProvider.java:194)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3887)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6797)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:4396)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:910)
    at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getFileInfo(AuthorizationProviderProxyClientProtocol.java:531)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:861)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2281)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2277)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1924)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2275)
;'
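
For context: access=EXECUTE on a directory inode is the traversal bit, so the check fires while something stats a path under /tmp (typically a Spark/Hive scratch directory), and the stack shows Sentry's authorization provider enforcing it, which can be stricter than the plain POSIX mode (drwxrwxrwt) shown in the message. A minimal diagnostic sketch, assuming the same interactive pyspark session as above with spark already defined, to see what the NameNode reports for /tmp:

# Sketch: ask the NameNode for /tmp's owner, group, and mode via the
# Hadoop FileSystem API exposed through the pyspark JVM gateway.
jvm = spark.sparkContext._jvm
hadoop_conf = spark.sparkContext._jsc.hadoopConfiguration()
fs = jvm.org.apache.hadoop.fs.FileSystem.get(hadoop_conf)
status = fs.getFileStatus(jvm.org.apache.hadoop.fs.Path("/tmp"))
print(status.getOwner(), status.getGroup(), status.getPermission().toString())
# Note: with the Sentry HDFS plugin active, the enforced ACLs may be
# stricter than the mode printed here.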

Re: Permission error when working with Spark

I'd like to add some additional information.

My username in the error message is reported as Lastname.Firstname,
but my actual username for Hadoop/Cloudera is lastname.firstname.
Also, my personal folder, which I can access, is /user/lastname.firstname.
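
If the case mismatch is the cause, it should show up in the username Spark presents to HDFS. A small sketch to check it, assuming the same interactive pyspark session as in the first post:

# Sketch: print the username the Spark context runs as; if this shows
# "Lastname.Firstname" while the accessible home directory is
# /user/lastname.firstname, the case mismatch would explain the
# AccessControlException above.
print(spark.sparkContext.sparkUser())

# The name Hadoop itself resolves can be checked the same way:
ugi = spark.sparkContext._jvm.org.apache.hadoop.security.UserGroupInformation
print(ugi.getCurrentUser().getShortUserName())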
