Member since: 09-14-2015
Posts: 41
Kudos Received: 16
Solutions: 7
My Accepted Solutions
Title | Views | Posted
--- | --- | ---
 | 1512 | 07-11-2017 05:38 AM
 | 1188 | 01-11-2017 05:38 PM
 | 1271 | 09-07-2016 06:45 PM
 | 1591 | 09-07-2016 06:00 PM
 | 2293 | 09-06-2016 09:03 AM
05-29-2020
10:11 AM
Hi, has anyone gotten this running and can post a running example? Thanks, Marcel
07-24-2017
02:09 AM
Hello, I did as directed above:
val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
sqlContext.setConf("yarn.timeline-service.enabled","false")
Now I have ended up with a new issue that is very similar to the earlier one, but now it is trying to cast from the YARN client to the YARN client, and I don't see Spark's assembly jar in the error. Let me know if any configuration changes are required. I am trying to run in cluster mode on HDP 2.4.2.0-258.
diagnostics: User class threw exception: java.lang.LinkageError:
ClassCastException: attempting to cast jar:file:/hadoop/hadoop/yarn/local/filecache/10020/spark-hdp-assembly.jar!/javax/ws/rs/ext/RuntimeDelegate.class to jar:file:/hadoop/hadoop/yarn/local/filecache/10020/spark-hdp-assembly.jar!/javax/ws/rs/ext/RuntimeDelegate.class
Exception in thread "main" org.apache.spark.SparkException: Application application_1499441914050_13463 finished with failed status
    at org.apache.spark.deploy.yarn.Client.run(Client.scala:1092)
    at org.apache.spark.deploy.yarn.Client$.main(Client.scala:1139)
    at org.apache.spark.deploy.yarn.Client.main(Client.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$runMain(SparkSubmit.scala:731)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
17/07/13 12:38:12 INFO ShutdownHookManager: Shutdown hook called
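Setting yarn.timeline-service.enabled on the HiveContext may only take effect after the YARN/ATS timeline client has already been initialized. A hedged sketch of passing the equivalent setting at submit time instead, so it is in the Hadoop configuration from the start; the application jar and main class below are placeholders, not values from this post:
# Only the --conf line is the relevant part; spark.hadoop.* properties are
# copied by Spark into the Hadoop Configuration used by the YARN client.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.hadoop.yarn.timeline-service.enabled=false \
  --class com.example.MyApp \
  my-app.jar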
01-26-2017
09:42 PM
4 Kudos
Livy is an open source REST interface for interacting with Spark. Authorized users can launch a Spark session and submit code. Two different users can each access their own private data and session, and they can collaborate on a notebook. Only the Livy server can submit a job securely to a Spark session.
Steps to configure the Livy interpreter to work with a secure HDP cluster:
1. Set up a proxy user for the Livy interpreter in core-site.xml. Go to Ambari -> HDFS -> Configs -> Custom core-site and add the properties below (a quick check for them is sketched right after):
hadoop.proxyuser.livy.groups=*
hadoop.proxyuser.livy.hosts=*
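As an optional quick check (not part of the original steps), you can confirm on a cluster node that the proxy-user properties are in effect after saving them in Ambari and restarting the required services:
# Both commands should print * once the custom core-site properties are live.
hdfs getconf -confKey hadoop.proxyuser.livy.groups
hdfs getconf -confKey hadoop.proxyuser.livy.hosts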
2. Configure the Livy interpreter in Zeppelin and add the configurations below:
livy.superusers=zeppelin-spark
Note: The value for livy.superusers should be your Zeppelin principal, which is zeppelin-{$Cluster_name}. For example, in this case you can find it by running the command below:
klist -kt /etc/security/keytabs/zeppelin.server.kerberos.keytab
Keytab name: FILE:/etc/security/keytabs/zeppelin.server.kerberos.keytab
KVNO Timestamp Principal
---- ----------------- --------------------------------------------------------
1 11/15/16 17:33:16 zeppelin-spark@HWX.COM
1 11/15/16 17:33:16 zeppelin-spark@HWX.COM
1 11/15/16 17:33:16 zeppelin-spark@HWX.COM
1 11/15/16 17:33:16 zeppelin-spark@HWX.COM
1 11/15/16 17:33:16 zeppelin-spark@HWX.COM
zeppelin-spark will be the superuser for the Livy interpreter. Make sure it matches livy.superusers in the livy-conf file.
livy.impersonation.enabled=true (this setting should also be present in livy-conf)
livy.server.access_control.enabled=true
livy.server.access_control.users=livy,zeppelin
livy.server.auth.type=kerberos
livy.server.auth.kerberos.keytab=/etc/security/keytabs/spnego.service.keytab
livy.server.auth.kerberos.principal=HTTP/spark-1.hwx.com@HWX.COM
livy.server.launch.kerberos.keytab=/etc/security/keytabs/livy.service.keytab
livy.server.launch.kerberos.principal=livy/spark-1.hwx.com@HWX.COM
Note: To configure Zeppelin with authentication for Livy, you need to set the following in the interpreter settings:
zeppelin.livy.principal=zeppelin-spark@HWX.COM
zeppelin.livy.keytab=/etc/security/keytabs/zeppelin.service.keytab
3. Make sure zeppelin.livy.url points to the hostname, not the IP address:
zeppelin.livy.url=http://spark-3.hwx.com:8998
4. After saving the configuration changes in the Livy interpreter, restart the interpreter for them to take effect.
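As an optional end-to-end check (not part of the original article), you can create a test session against the Livy REST endpoint from step 3 using SPNEGO authentication. The proxyUser value below is only an example user, not something from this setup:
# Requires a valid Kerberos ticket first (kinit as a user Livy may proxy).
# --negotiate -u : makes curl use SPNEGO against the Kerberized Livy server.
curl --negotiate -u : \
  -H "Content-Type: application/json" \
  -X POST \
  -d '{"kind": "spark", "proxyUser": "user1"}' \
  http://spark-3.hwx.com:8998/sessions
# List sessions to confirm the new session appears and reaches the idle state:
curl --negotiate -u : http://spark-3.hwx.com:8998/sessions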