
Error while executing sql in spark-shell

New Contributor

Hi,

On my local machine I have set up Hadoop, Hive, Spark, and Scala this week, and I want to execute a query from spark-shell. The following statements work fine until the SQL statement.

scala> import org.apache.spark.sql.hive.HiveContext
scala> import org.apache.spark.sql.SQLContext
scala> val hiveCtx = new HiveContext(sc)
scala> import hiveCtx._
scala> var rec1 = hiveCtx.sql("""SELECT * from bkfs.tbl1 LIMIT 5""")
scala> rec1 = hiveCtx.sql("""SELECT * FROM BKFS.ASMT_02013 LIMIT 1""")

scala.reflect.internal.Types$TypeError: bad symbolic reference. A signature in HiveMetastoreCatalog.class refers to term cache
in package com.google.common which is not available.
It may be completely missing from the current classpath, or the version on
the classpath might be incompatible with the version used when compiling HiveMetastoreCatalog.class.
That entry seems to have slain the compiler. Shall I replay
your session? I can re-run each line except the last one.
[y/n]
Replaying: import org.apache.spark.sql.hive.HiveContext
error: while compiling: <console>
          during phase: jvm
       library version: version 2.10.4
      compiler version: version 2.10.4
   reconstructed args:

   last tree to typer: Apply(constructor $read)
               symbol: constructor $read in class $read (flags: <method> <triedcooking>)
    symbol definition: def <init>(): $line28.$read
                  tpe: $line28.$read
        symbol owners: constructor $read -> class $read -> package $line28
       context owners: class iwC -> package $line28

== Enclosing template or block ==
Template( // val <local $iwC>: <notype>, tree.tpe=$line28.iwC
  "java.lang.Object", "scala.Serializable" // parents
  ValDef(
    private "_"
    <tpt>
    <empty>
  )
)

== Expanded type of tree ==
TypeRef(TypeSymbol(class $read extends Serializable))

uncaught exception during compilation: java.lang.AssertionError
Stopping spark context.
<console>:8: error: not found: value sc
       sc.stop()
       ^
Exception in thread "main" java.lang.AssertionError: assertion failed: Tried to find '$line28' in '/tmp/spark-e0d2a57e-c03e-467a-92b2-c560baaa9c34' but it is not a directory

Could you suggest what the issue might be? Thanks!

2 Replies

Re: Error while executing sql in spark-shell

Rising Star

Why are you creating a new HiveContext in your shell instead of using the sqlContext that spark-shell already provides?
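
In Spark 1.x builds compiled with Hive support, spark-shell already binds sqlContext to a HiveContext at startup, so no new context is needed. A minimal sketch, reusing the table name from the question:

scala> // sqlContext is pre-created by spark-shell; query Hive tables through it directly
scala> val rec1 = sqlContext.sql("SELECT * FROM bkfs.tbl1 LIMIT 5")
scala> rec1.show()

Creating a second HiveContext in the same JVM can also clash with the one the shell already holds, which is a separate reason to prefer the built-in one.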

Re: Error while executing sql in spark-shell

@Revathy Mourouguessane

What error do you get if you execute only these steps?

import org.apache.spark.sql.hive.HiveContext
import org.apache.spark.sql.SQLContext 
val hiveCtx = new HiveContext(sc)
var rec1 = hiveCtx.sql("SELECT * from bkfs.tbl1 LIMIT 5")
var rec2 = hiveCtx.sql("SELECT * FROM BKFS.ASMT_02013 LIMIT 1") 
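
If the same "bad symbolic reference ... term cache in package com.google.common" error comes back, it usually points at a Guava version conflict on the classpath, since HiveMetastoreCatalog was compiled against a Guava that your shell cannot see. As a sketch only (the jar path and Guava version here are assumptions, not taken from your setup), you could try launching spark-shell with the Guava jar that matches your Spark build:

spark-shell --jars /path/to/guava-14.0.1.jar

Checking which Guava jars appear under your Spark and Hive lib directories would confirm whether two incompatible versions are being mixed.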