
Scala with Hive in Eclipse (Scala)

New Contributor

Hi Guys,

I'm working with Scala + Hive, and I'm developing my code in Eclipse using HiveContext. It only works when I move the code to the sandbox. Is there a way to test HiveContext from my local machine by connecting to my sandbox?

My Code:

    // Imports needed for this snippet
    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SQLContext
    import org.apache.spark.sql.hive.HiveContext

    // Set up the Spark context (local master for testing from Eclipse)
    val sparkConf = new SparkConf().setAppName("app-name").setMaster("local")
    val sc = new SparkContext(sparkConf)
    val sqlContext = new SQLContext(sc)
    val hiveContext = new HiveContext(sc)

    System.out.println("INFO: ****************** Starting Connection HIVE ******************")
    // Query Hive through the HiveContext
    hiveContext.sql("use mydatabase")
    hiveContext.sql("select * from mytable").show(10)
    System.out.println("INFO: ****************** Connected in HIVE ******************")
    System.out.println("INFO: ****************** End Test Scala ******************")
3 REPLIES

Super Guru

You need to have a hive-site.xml for HiveContext.

Create hive-site.xml in the Spark conf directory:

As user root, create the file SPARK_HOME/conf/hive-site.xml. Edit the file to contain only the following configuration setting:

    <configuration>
      <property>
        <name>hive.metastore.uris</name>
        <!-- Make sure that <value> points to the Hive Metastore URI in your cluster -->
        <value>thrift://sandbox.hortonworks.com:9083</value>
        <description>URI for client to contact metastore server</description>
      </property>
    </configuration>


See: http://hortonworks.com/hadoop-tutorial/a-lap-around-apache-spark/
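
If you are running from Eclipse on your local machine (as in the question), there is no SPARK_HOME/conf on the classpath. As an alternative, here is a minimal sketch of pointing the HiveContext at the sandbox metastore programmatically, assuming the metastore at sandbox.hortonworks.com:9083 is reachable from your machine (the app name is just illustrative):

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.hive.HiveContext

    val sparkConf = new SparkConf().setAppName("hive-local-test").setMaster("local[*]")
    val sc = new SparkContext(sparkConf)
    val hiveContext = new HiveContext(sc)

    // Point the client at the remote metastore instead of relying on
    // SPARK_HOME/conf/hive-site.xml (host/port taken from the config above)
    hiveContext.setConf("hive.metastore.uris", "thrift://sandbox.hortonworks.com:9083")

    hiveContext.sql("use mydatabase")
    hiveContext.sql("select * from mytable").show(10)

Whether setConf is picked up before the metastore client is created can depend on the Spark version; placing a copy of hive-site.xml on the Eclipse project classpath (e.g., src/main/resources) is the more reliable route.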

New Contributor

I'm working with Eclipse on a Windows box. I'm trying to run my code (only for tests) on the Windows box, connecting to my sandbox.

Super Guru

Make sure your firewall is not blocking it.

Can you connect to your sandbox from the command line or from a browser? You are running local Spark, and local Spark is looking for a connected Hadoop. You have to specify the Hadoop configuration as mentioned and have environment variables pointing to the correct place.
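
As a quick connectivity check from the Windows box, a small sketch like the following (host and port are assumptions taken from the hive-site.xml above) tells you whether the metastore port is reachable at all, independent of Spark:

    import java.net.{InetSocketAddress, Socket}

    // Try to open a TCP connection to the sandbox metastore port.
    val socket = new Socket()
    try {
      socket.connect(new InetSocketAddress("sandbox.hortonworks.com", 9083), 5000)
      println("Metastore port is reachable")
    } catch {
      case e: Exception => println("Cannot reach metastore: " + e.getMessage)
    } finally {
      socket.close()
    }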

What version of Spark? Did you download a binary version or build it yourself?

What version of Scala? JDK?

Setup

Make sure you have the Java 7 or Java 8 SDK installed for Windows. Then you’ll need to download Scala 2.10.x, SBT and then Spark.
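
For reference, a minimal build.sbt sketch for such a project might look as follows; the Scala and Spark versions are assumptions, so match them to your sandbox:

    // build.sbt -- versions are illustrative, match them to your cluster
    name := "hive-local-test"

    scalaVersion := "2.10.6"

    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % "1.6.1",
      "org.apache.spark" %% "spark-sql"  % "1.6.1",
      "org.apache.spark" %% "spark-hive" % "1.6.1"
    )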

Solving Local Spark Issues

    set SPARK_MASTER_IP=127.0.0.1
    set SPARK_LOCAL_IP=127.0.0.1
    set SCALA_HOME=C://windoze/scala-2.10.6
    set PATH=%PATH%;%SCALA_HOME%/bin