Created 04-30-2015 08:39 AM
When I fire up spark-shell under the latest parcels I get this:
. . .
15/04/30 15:03:02 INFO SparkILoop: Created spark context..
Spark context available as sc.
java.lang.NoClassDefFoundError: org/apache/hadoop/hive/conf/HiveConf
at java.lang.Class.getDeclaredConstructors0(Native Method)
. . .
<console>:10: error: not found: value sqlContext
import sqlContext.implicits._
^
<console>:10: error: not found: value sqlContext
import sqlContext.sql
scala>
Is spark-shell trying to execute a set of commands from some config file? Where would those come from? Do others see this issue? I don't recall setting anything up myself, and I don't need a HiveContext for now. As far as I can tell these errors are benign, since I can create a SQLContext manually, but it would be nice to start cleanly. Thanks for any suggestions.
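For reference, here is a minimal sketch of the manual workaround mentioned above, to be pasted into spark-shell. It assumes the Spark 1.x API (current as of this post) and uses the `sc` SparkContext that the shell already created successfully:

```scala
// Build a plain SQLContext by hand, with no Hive dependency,
// using the SparkContext (`sc`) the shell provides.
import org.apache.spark.sql.SQLContext

val sqlContext = new SQLContext(sc)

// Re-enable the implicit conversions that the failed startup
// imports (sqlContext.implicits._ and sqlContext.sql) would have given.
import sqlContext.implicits._
```

This sidesteps the HiveConf class-loading failure entirely, since only a HiveContext needs the Hive jars on the classpath.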