Member since 11-01-2017 · 3 Posts · 0 Kudos Received · 0 Solutions
11-06-2017 04:36 PM
My response went inline above, not sure why. Included is an example of the submit command that I am using.
11-06-2017 04:35 PM
I did copy the file to all of the nodes, but only because all of the nodes (containers) are built from an identical image; I have not tested pushing it to only a single node. My launch now looks like:

from pyspark import SparkConf, SparkContext
from pyspark.sql import SQLContext

conf = SparkConf()
conf.set('spark.driver.extraClassPath', '/usr/local/hbase-1.2.6/lib/*')
conf.set('spark.executor.extraClassPath', '/usr/local/hbase-1.2.6/lib/*')
sc = SparkContext(master='spark://master:7077', conf=conf)
sqlcontext = SQLContext(sc)

I am figuring this out as I work through my use case, so hopefully this works on your side as well.
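For completeness, the same class-path settings can also be passed through the PYSPARK_SUBMIT_ARGS environment variable before the interpreter starts Spark. This is only a sketch of that alternative, the hbase lib path is an assumption taken from my container image, and note that (as mentioned in my earlier post) this route did not help with hbase-site.xml itself:

```python
import os

# Hypothetical alternative: set the driver/executor class paths via
# PYSPARK_SUBMIT_ARGS instead of a SparkConf object. The jar path below
# is an assumption from my environment; adjust for your HBase install.
hbase_libs = '/usr/local/hbase-1.2.6/lib/*'
os.environ['PYSPARK_SUBMIT_ARGS'] = (
    '--driver-class-path {libs} '
    '--conf spark.executor.extraClassPath={libs} '
    'pyspark-shell'.format(libs=hbase_libs)
)
print(os.environ['PYSPARK_SUBMIT_ARGS'])
```

The string must end in 'pyspark-shell' for PySpark to pick it up, and it has to be set before SparkContext is created.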
11-01-2017 09:05 PM
I just got this working after seeing similar issues caused by an inability to reach the ZooKeeper quorum properly. I tried adding the hbase-site.xml file via PYSPARK_SUBMIT_ARGS and also via a SparkConf object, with no joy either way. What did work, inside some Docker containers in my case, was manually copying hbase-site.xml into $SPARK_HOME/conf.
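A minimal sketch of that manual copy step, assuming SPARK_HOME is set in the environment and a typical HBase install layout (the source path is an assumption from my containers):

```python
import os
import shutil

# Copy hbase-site.xml into Spark's conf directory so it lands on the
# classpath that Spark builds for the driver and executors.
# Source location is an assumption; adjust for your HBase install.
spark_home = os.environ.get('SPARK_HOME', '/usr/local/spark')
src = '/usr/local/hbase-1.2.6/conf/hbase-site.xml'
dst = os.path.join(spark_home, 'conf', 'hbase-site.xml')

# Guarded so the sketch is safe to run on a machine without HBase.
if os.path.exists(src):
    shutil.copy(src, dst)
print(dst)
```

In a Docker setup the same effect can be had with a COPY step in the image build, which is effectively what pushing to every node amounted to for me.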