Recently I installed the Cloudera QuickStart VM 5.8 on my Windows machine on top of VMware. By default, the Spark UI and ZooKeeper links were not shown in Hue, so I edited hue.ini, which had:
app_blacklist = zookeeper, spark
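To re-enable both apps, the entries just need to be removed from that blacklist. As a sketch (assuming the stock CDH 5.8 hue.ini, where `app_blacklist` lives under the `[desktop]` section), the edited section would look like:

```ini
; hue.ini — leave app_blacklist empty so no apps are hidden,
; or list only the apps you still want to hide
[desktop]
  app_blacklist=
```

Hue needs to be restarted after this change for the links to appear.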
After doing this I was able to download some Spark examples, but the Spark UI link was still not displayed. However, the ZooKeeper UI link did appear.
From the downloaded examples I selected the sample notebook, which opened the Spark notebook UI. It had some examples, but when I run them I get the following error:
HTTPConnectionPool(host='localhost', port=8998): Max retries exceeded with url: /sessions (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7fdd7c5f2e50>: Failed to establish a new connection: [Errno 111] Connection refused',))
Do I need to make any changes in addition to the one I made in the hue.ini file? Please guide me through this.
The error was resolved after I installed the Livy server and started it with the following commands:
export SPARK_HOME=/usr/lib/spark
export HADOOP_CONF_DIR=/etc/hadoop/conf
$HOME/livy-server-0.2.0/bin/livy-server
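The "Connection refused" error above simply means nothing was listening on port 8998, which is Livy's default port. A quick way to confirm the server is up after starting it (assuming the default port and host) is to hit Livy's REST API directly:

```shell
# List existing Livy sessions; an empty but valid response
# (e.g. {"from":0,"total":0,"sessions":[]}) means Livy is up
curl http://localhost:8998/sessions

# Optionally create a PySpark session by hand to confirm
# Livy can actually reach Spark on the VM
curl -X POST -H 'Content-Type: application/json' \
     -d '{"kind": "pyspark"}' \
     http://localhost:8998/sessions
```

If the first `curl` is refused, Hue's notebook app will fail with the same `HTTPConnectionPool` error, since it talks to Livy over this API.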
I am also having a similar issue with the QuickStart VM (CDH 5.8), even though I have started the Livy server. I can successfully create a new notebook with one simple PySpark command: print 1 + 1 + 1. However, when I save this notebook and re-open it, I get the error "Session '2' not found." (error 404). Similarly, if I open the Sample Notebook and run the first command, print 1 + 1 + 1, I get the error "Session '-1' not found." (error 404). As I move from notebook to notebook, should old sessions automatically be deleted and new sessions automatically created? Am I missing something here?
Issue solved. Clicking the gear icon shows a dialog with the sessions for each language in the notebook. Clicking the Recreate icon recreates the session for that language. You need to do this manually for every language in the notebook. NOTE: Using the Sample Notebook, I could not recreate the session for Scala; I got the following error: "The Spark session could not be created in the cluster: timeout". I also could not recreate the session for R; I got the error "Requirement failed: sparkr.zip not found; cannot run sparkr application" (error 400). Perhaps some additional configuration is needed in the QuickStart VM for R?