02-26-2019
02:12 PM
Hi! We use Hive with LLAP, so "Run as end user" = false. Impersonation is enabled for the Livy interpreter. We also use Ranger to manage permissions.

Services / Spark2 / Configs, Custom livy2-conf:
livy.file.local-dir-whitelist = /usr/hdp/current/hive_warehouse_connector/
livy.spark.security.credentials.hiveserver2.enabled = true
livy.spark.sql.hive.hiveserver2.jdbc.url = jdbc:hive2://dwh-test-hdp-master03.COMPANY.ru:10000/
livy.spark.sql.hive.hiveserver2.jdbc.url.principal = hive/_HOST@COMPANY.RU
livy.spark.yarn.security.credentials.hiveserver2.enabled = true
livy.superusers = zeppelin-dwh_test

Custom spark2-defaults:
spark.datasource.hive.warehouse.load.staging.dir = /tmp
spark.datasource.hive.warehouse.metastoreUri = thrift://dwh-test-hdp-master03.COMPANY.ru:9083
spark.hadoop.hive.llap.daemon.service.hosts = @llap0
spark.hadoop.hive.zookeeper.quorum = dwh-test-hdp-master01.COMPANY.ru:2181,dwh-test-hdp-master02.COMPANY.ru:2181,dwh-test-hdp-master03.COMPANY.ru:2181
spark.history.ui.admin.acls = knox
spark.security.credentials.hive.enabled = true
spark.security.credentials.hiveserver2.enabled = true
spark.sql.hive.hiveserver2.jdbc.url = jdbc:hive2://dwh-test-hdp-master03.COMPANY.ru:10000/
spark.sql.hive.hiveserver2.jdbc.url.principal = hive/_HOST@COMPANY.RU
spark.sql.hive.llap = true
spark.yarn.security.credentials.hiveserver2.enabled = true

Custom spark2-hive-site-override:
hive.llap.daemon.service.hosts = @llap0

Services / HDFS / Configs. If you suspect the problem is in delegation, you can also set these values to an asterisk for testing. Custom core-site:
hadoop.proxyuser.hive.groups = *
hadoop.proxyuser.hive.hosts = *
hadoop.proxyuser.livy.groups = *
hadoop.proxyuser.livy.hosts = *
hadoop.proxyuser.zeppelin.hosts = *
hadoop.proxyuser.zeppelin.groups = *

Zeppelin %livy2 interpreter properties (name value):
livy.spark.hadoop.hive.llap.daemon.service.hosts @llap0
livy.spark.jars file:/usr/hdp/current/hive_warehouse_connector/hive-warehouse-connector-assembly-1.0.0.3.1.0.0-78.jar
livy.spark.security.credentials.hiveserver2.enabled true
livy.spark.sql.hive.hiveserver2.jdbc.url jdbc:hive2://dwh-test-hdp-master03.COMPANY.ru:10000/
livy.spark.sql.hive.hiveserver2.jdbc.url.principal hive/_HOST@COMPANY.RU
livy.spark.sql.hive.llap true
livy.spark.submit.pyFiles file:/usr/hdp/current/hive_warehouse_connector/pyspark_hwc-1.0.0.3.1.0.0-78.zip
livy.spark.yarn.security.credentials.hiveserver2.enabled true
livy.superusers livy,zeppelin
spark.security.credentials.hiveserver2.enabled true
spark.sql.hive.hiveserver2.jdbc.url.principal hive/_HOST@COMPANY.RU
zeppelin.livy.concurrentSQL false
zeppelin.livy.displayAppInfo true
zeppelin.livy.keytab /etc/security/keytabs/zeppelin.server.kerberos.keytab
zeppelin.livy.maxLogLines 1000
zeppelin.livy.principal zeppelin-dwh_test@COMPANY.RU
zeppelin.livy.pull_status.interval.millis 1000
zeppelin.livy.restart_dead_session false
zeppelin.livy.session.create_timeout 120
zeppelin.livy.spark.sql.field.truncate true
zeppelin.livy.spark.sql.maxResult 1000
zeppelin.livy.url http://dwh-test-hdp-master02.COMPANY.ru:8999

Sample code for test:

%livy2
import com.hortonworks.hwc.HiveWarehouseSession
import com.hortonworks.hwc.HiveWarehouseSession._
val hive = HiveWarehouseSession.session(spark).build()
hive.showDatabases().show(100)

Ranger audit example:
...
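Since livy.spark.submit.pyFiles already ships the pyspark_hwc zip, the same smoke test can also be run from a PySpark paragraph. A minimal sketch, assuming the HWC Python package is importable as pyspark_llap (its name in HDP 3.x); it needs a live Livy session against the LLAP cluster, so it is illustrative rather than runnable standalone:

```python
%livy2.pyspark
# `spark` is the SparkSession provided by the Livy session; pyspark_llap
# comes from the pyspark_hwc zip configured in livy.spark.submit.pyFiles.
from pyspark_llap import HiveWarehouseSession

# Build an HWC session from the existing SparkSession and list databases,
# mirroring the Scala test above. Ranger policies still apply per end user.
hive = HiveWarehouseSession.session(spark).build()
hive.showDatabases().show(100)
```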