Support Questions
Find answers, ask questions, and share your expertise

Exception while connecting Hive and Spark: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwxrwxr-x

Solved


Explorer

I'm using Spark 2.4.5, Hive 3.1.2, and Hadoop 3.2.1. While accessing Hive from Spark I got the following exception:

Exception in thread "main" org.apache.spark.sql.AnalysisException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwxrwxr-x; 

This is my source code:

package com.spark.hiveconnect

import java.io.File

import org.apache.spark.sql.{Row, SaveMode, SparkSession}

object sourceToHIve {
  case class Record(key: Int, value: String)
  def main(args: Array[String]){
    val warehouseLocation = new File("spark-warehouse").getAbsolutePath

    val spark = SparkSession
      .builder()
      .appName("Spark Hive Example").master("local")
      .config("spark.sql.warehouse.dir", warehouseLocation)
      .enableHiveSupport()
      .getOrCreate()

    import spark.implicits._
    import spark.sql

    sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING) USING hive")
    sql("LOAD DATA LOCAL INPATH '/usr/local/spark3/examples/src/main/resources/kv1.txt' INTO TABLE src")
    sql("SELECT * FROM src").show()
    spark.stop()
  }

}

This is my sbt file:

name := "SparkHive"
version := "0.1"
scalaVersion := "2.12.10"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.5"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.5"
libraryDependencies += "mysql" % "mysql-connector-java" % "8.0.19"
libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.4.5"

How can I solve this issue? While observing the console I also saw the statement below; is this the reason I am getting the error?

20/05/28 14:03:04 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(UDHAV.MAHATA); groups with view permissions: Set(); users  with modify permissions: Set(UDHAV.MAHATA); groups with modify permissions: Set()

Can anyone help me?

Thank You!


4 REPLIES

Re: Exception while connecting Hive and Spark: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwxrwxr-x

Master Collaborator

@Udhav  You can solve the issue by making /tmp/hive writeable:

 

hdfs dfs -chmod 777 /tmp/hive

 

when complete you should see the correct permissions as follows (drwxrwxrwx):

 

[hdfs@hdp ~]$ hdfs dfs -chmod 777 /tmp/hive
[hdfs@hdp ~]$ hdfs dfs -ls /tmp | grep hive
drwxrwxrwx   - hive   hdfs            0 2020-04-14 13:33 /tmp/hive
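One caveat worth hedging: because the job runs with master("local"), Spark's embedded Hive client may be checking the scratch dir on the local filesystem rather than on HDFS (this depends on how fs.defaultFS is configured). If the HDFS chmod above does not help, a minimal sketch of the same fix on the local filesystem:

```shell
# Sketch, assuming the scratch-dir check hits the *local* /tmp/hive
# (possible when fs.defaultFS points at the local filesystem):
mkdir -p /tmp/hive
chmod -R 777 /tmp/hive
ls -ld /tmp/hive   # should now show drwxrwxrwx
```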

 

 

If this answer resolves your issue or allows you to move forward, please choose to ACCEPT this solution and close this topic. If you have further dialogue on this topic please comment here or feel free to private message me. If you have new questions related to your use case please create a separate topic and feel free to tag me in your post.

Thanks,

Steven @ DFHZ

Re: Exception while connecting Hive and Spark: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwxrwxr-x

Explorer

@stevenmatison 

I already tried that a few times, but I am still getting the same exception.

[Screenshot attached: rsz_screenshot_from_2020-05-28_17-56-05.png]

Actually I have two users (A and B). My source code runs as user A, while Hadoop and Hive run as user B. I guess that may be why I am getting this issue, but I am not sure. I also tried copying my IntelliJ project to the other user, but then I am not even able to run my code.

Thanks


Re: Exception while connecting Hive and Spark: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwxrwxr-x

Master Collaborator

@Udhav Yes, permissions are very user specific. Can you create separate scratch locations for each user? They would need to be chowned to the correct user and made writable (chmod 777). Right now /tmp/hive is owned by hadoopuser:supergroup.
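As a sketch of that suggestion (the user name "A" and the scratch path are placeholders from this thread, not verified values for your cluster):

```shell
# Hypothetical per-user scratch dir for user A (names are placeholders).
hdfs dfs -mkdir -p /tmp/hive-A
hdfs dfs -chown A:supergroup /tmp/hive-A
hdfs dfs -chmod 777 /tmp/hive-A
```

The Spark job would then need to point Hive at that location, e.g. by passing `--conf spark.hadoop.hive.exec.scratchdir=/tmp/hive-A` to spark-submit (hive.exec.scratchdir is the Hive property that controls the scratch location).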

 


 


If this answer resolves your issue or allows you to move forward, please choose to ACCEPT this solution and close this topic. If you have further dialogue on this topic please comment here or feel free to private message me.

Thanks,

Steven


Re: Exception while connecting Hive and Spark: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwxrwxr-x

Explorer

Hey @stevenmatison, it's working now. Thanks for the help!
