
HUE Spark notebook

New Contributor

Hi guys,

 

I've upgraded from CDH 5.5.x to 5.7.x, and after this upgrade I can't see "Spark Notebook" in the Hue interface anymore (specifically Query Editors -> Spark Notebook). My Advanced Configuration Snippet contains the following, and apart from the Spark issue everything works as it should:

 

[desktop]
app_blacklist=hbase,search,indexer,security

 

And yet there's no link in the Hue UI, although I am able to access the Spark Notebook if I enter the address directly: https://hue.example.com/notebook/. Any recommendations? Thanks.

1 ACCEPTED SOLUTION

Super Guru
Ha, in 5.7 it was actually

[beeswax]
use_new_editor=true
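
For the setup in the original post, the full safety-valve snippet would then look something like this (a sketch that keeps the OP's existing [desktop] blacklist alongside the new flag):

```ini
# hue_safety_valve.ini - sketch for CDH 5.7
[desktop]
app_blacklist=hbase,search,indexer,security

[beeswax]
use_new_editor=true
```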


7 REPLIES

Super Guru
In the advanced configuration, could you add

[notebook]
show_notebooks=true

https://github.com/cloudera/hue/blob/master/desktop/conf.dist/hue.ini#L577


We are currently working on merging the editor and the notebook; it will all be unified in Hue 4.

New Contributor

Hi,

 

Thank you for your answer. I've tested the proposed solution; currently my hue_safety_valve.ini contains:

 

[notebook]
show_notebooks=true
[desktop]
app_blacklist=hbase,search,indexer,security
[[database]]
engine=postgresql_psycopg2
name=hue
host=localhost
port=5552
user=hue
password=<<somepass>>

but the result is the same as before - it still doesn't work :/.

Super Guru
Ha, in 5.7 it was actually

[beeswax]
use_new_editor=true


That did not work for us on Hue 3.9.0. We used

[desktop]
use_new_editor=true

Master Collaborator

Hi Guys,

 

is there an alternative to the --jars option of spark-submit in the Spark notebook in Hue?

New Contributor

Hello guys,

I'm using Cloudera 5.12.1 and Hue 4.0.

 

I have all settings at their defaults in Hue, and I'm able to see: Impala, Hive, Pig, Java, Spark, MapReduce, Shell, Sqoop1, and Distcp, but unfortunately I cannot see: Notebook, Spark, pySpark, ...

 

Please, any idea how to fix this?

 

I already checked the configuration file in the default location /var/run/cloudera-scm-agent/process/1051-hue-HUE_SERVE, and the hue.ini file looks like:

*****************

[desktop]
secret_key_script=/var/run/cloudera-scm-agent/process/1051-hue-HUE_SERVER/altscript.sh sec-1-secret_key
http_host=vm1.solutia.home
cluster_id=507d3cd0-e317-4147-a69f-a2b43854ec92
hue_load_balancer=http://vm1.solutia.home:8889
app_blacklist=spark,zookeeper,security
http_port=8888
time_zone=America/Los_Angeles
django_debug_mode=false
http_500_debug_mode=false
cherrypy_server_threads=50
default_site_encoding=utf
collect_usage=true
audit_event_log_dir=/var/log/hue/audit/hue-HUE_SERVER-0af71b69575a3278e45a18d9c57aa712
audit_log_max_file_size=100MB
[[metrics]]
location=/var/log/hue/metrics-hue_server/metrics.log
collection_interval=30000
[[custom]]
[[auth]]
idle_session_timeout=-1
user_augmentor=desktop.auth.backend.DefaultUserAugmentor
backend=desktop.auth.backend.AllowFirstUserDjangoBackend
[[database]]
engine=postgres
host=vm1.solutia.home
port=7432
user=hue1
password_script=/var/run/cloudera-scm-agent/process/1051-hue-HUE_SERVER/altscript.sh sec-5-password
name=hue1
[[smtp]]
host=localhost
port=25
user=
password=
tls=no
[[kerberos]]
[metadata]
[[navigator]]
navmetadataserver_auth_type=CMDB
navmetadataserver_cmdb_user=__cloudera_internal_user__hue-HUE_SERVER-0af71b69575a3278e45a18d9c57aa712
navmetadataserver_cmdb_password_script=/var/run/cloudera-scm-agent/process/1051-hue-HUE_SERVER/altscript.sh sec-9-navmetadataserver_cmdb_password
[hadoop]
[[hdfs_clusters]]
[[[default]]]
fs_defaultfs=hdfs://hacluster
webhdfs_url=http://vm2.solutia.home:14000/webhdfs/v1
hadoop_hdfs_home=/opt/cloudera/parcels/CDH-5.12.1-1.cdh5.12.1.p0.3/lib/hadoop-hdfs

hadoop_bin=/opt/cloudera/parcels/CDH-5.12.1-1.cdh5.12.1.p0.3/lib/hadoop/bin/hadoop
hadoop_conf_dir=/var/run/cloudera-scm-agent/process/1051-hue-HUE_SERVER/yarn-conf
security_enabled=false
temp_dir=/tmp
[[yarn_clusters]]
[[[default]]]
resourcemanager_host=vm1.solutia.home
resourcemanager_api_url=http://vm1.solutia.home:8088/
proxy_api_url=http://vm1.solutia.home:8088/
resourcemanager_port=8032
logical_name=yarnRM
history_server_api_url=http://vm1.solutia.home:19888/
security_enabled=false
submit_to=true
hadoop_mapred_home=/opt/cloudera/parcels/CDH-5.12.1-1.cdh5.12.1.p0.3/lib/hadoop-mapreduce
hadoop_bin=/opt/cloudera/parcels/CDH-5.12.1-1.cdh5.12.1.p0.3/lib/hadoop/bin/hadoop
hadoop_conf_dir=/var/run/cloudera-scm-agent/process/1051-hue-HUE_SERVER/yarn-conf
[[[ha]]]
resourcemanager_host=vm2.solutia.home
resourcemanager_api_url=http://vm2.solutia.home:8088/
proxy_api_url=http://vm2.solutia.home:8088/
resourcemanager_port=8032
logical_name=yarnRM
history_server_api_url=http://vm1.solutia.home:19888/
security_enabled=false
submit_to=true
hadoop_mapred_home=/opt/cloudera/parcels/CDH-5.12.1-1.cdh5.12.1.p0.3/lib/hadoop-mapreduce
hadoop_bin=/opt/cloudera/parcels/CDH-5.12.1-1.cdh5.12.1.p0.3/lib/hadoop/bin/hadoop
hadoop_conf_dir=/var/run/cloudera-scm-agent/process/1051-hue-HUE_SERVER/yarn-conf
[beeswax]
hive_server_host=vm1.solutia.home
hive_server_port=10000
server_conn_timeout=120
hive_conf_dir=/var/run/cloudera-scm-agent/process/1051-hue-HUE_SERVER/hive-conf
[liboozie]
remote_data_dir=/user/hue/jobsub
oozie_url=http://vm1.solutia.home:11000/oozie
security_enabled=false
[useradmin]
[impala]
server_host=vm1.solutia.home
server_port=21050

server_conn_timeout=120
[sqoop]
server_url=http://vm2.solutia.home:12000/sqoop
sqoop_conf_dir=/var/run/cloudera-scm-agent/process/1051-hue-HUE_SERVER/sqoop2-conf
[search]
solr_url=http://vm1.solutia.home:8983/solr
[hbase]
hbase_clusters=(HBase|vm3.solutia.home:9090)
hbase_conf_dir=/var/run/cloudera-scm-agent/process/1051-hue-HUE_SERVER/hbase-conf
[proxy]
whitelist=(localhost|127\.0\.0\.1):(50030|50070|50060|50075)
[shell]
[[ shelltypes ]]
[[[ pig ]]]
nice_name="Pig Shell (Grunt)"
command="/opt/cloudera/parcels/CDH-5.12.1-1.cdh5.12.1.p0.3/lib/pig/../../bin/pig -l /dev/null"
help="The command-line interpreter for Pig"
[[[[ environment ]]]]
[[[[[ JAVA_HOME ]]]]]
value="/usr/java/jdk1.7.0_67-cloudera"
[[[[[ HADOOP_CONF_DIR ]]]]]
value="/var/run/cloudera-scm-agent/process/1051-hue-HUE_SERVER/yarn-conf"
[[[ hbase ]]]
nice_name="HBase Shell"
command="/opt/cloudera/parcels/CDH-5.12.1-1.cdh5.12.1.p0.3/lib/hbase/bin/hbase shell"
help="The command-line HBase client interface."
[[[[ environment ]]]]
[[[[[ JAVA_HOME ]]]]]
value="/usr/java/jdk1.7.0_67-cloudera"
[[[[[ HADOOP_CONF_DIR ]]]]]
value="/var/run/cloudera-scm-agent/process/1051-hue-HUE_SERVER/yarn-conf"
[[[[[ HBASE_CONF_DIR ]]]]]
value="/var/run/cloudera-scm-agent/process/1051-hue-HUE_SERVER/hbase-conf"
[[[ sqoop2 ]]]
nice_name="Sqoop2 Shell"
command="/opt/cloudera/parcels/CDH-5.12.1-1.cdh5.12.1.p0.3/lib/sqoop2/../../bin/sqoop2"
help="The command-line Sqoop2 client"
[[[[ environment ]]]]
[[[[[ JAVA_HOME ]]]]]
value="/usr/java/jdk1.7.0_67-cloudera"
[zookeeper]
[[clusters]]
[[[default]]]
host_ports=vm2.solutia.home:2181,vm3.solutia.home:2181,vm1.solutia.home:2181
[libzookeeper]
ensemble=vm2.solutia.home:2181,vm3.solutia.home:2181,vm1.solutia.home:2181

 

 

Thank you for your help.

Martin

Expert Contributor

I think you are missing this, which was mentioned here:

 

[desktop]
use_new_editor=true

 

Hope it helps
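
Also worth noting: the hue.ini posted above has app_blacklist=spark,zookeeper,security, and a blacklisted app stays hidden regardless of the editor flag, so spark would need to come out of the blacklist as well. A sketch of a combined safety-valve snippet (adjust the blacklist to your needs):

```ini
[desktop]
# remove spark from the blacklist so the Spark/notebook apps can show;
# zookeeper and security stay hidden as before
app_blacklist=zookeeper,security
use_new_editor=true
```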