
Livy Server Impersonation Problem

New Contributor

We cloned the Livy server from https://github.com/cloudera/livy and started it so that the Hue notebook app would work.
The Livy server starts and the notebook app runs smoothly as long as I do not enable: livy.impersonation.enabled = true

The error shows:
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]

Here are some log lines leading up to the error:
16/05/23 17:48:24 INFO LivyServer: Using spark-submit version 1.5.0-cdh5.6.0
16/05/23 17:48:24 WARN RequestLogHandler: !RequestLog
16/05/23 17:48:25 INFO WebServer: Starting server on http://techlab01-rh.example.com:8998
16/05/23 17:48:59 INFO InteractiveSession: Creating LivyClient for sessionId: 0
16/05/23 17:49:00 WARN RSCConf: Your hostname, techlab01-rh.example.com, resolves to a loopback address, but we couldn't find any external IP address!
16/05/23 17:49:00 WARN RSCConf: Set livy.local.rpc.server.address if you need to bind to another address.
16/05/23 17:49:00 INFO SessionManager: Registering new session 0
16/05/23 17:49:03 INFO ContextLauncher: 16/05/23 17:49:03 INFO RSCDriver: Connecting to: techlab01-rh.example.com:37410
16/05/23 17:49:03 INFO ContextLauncher: 16/05/23 17:49:03 INFO RSCDriver: Starting RPC server...
16/05/23 17:49:03 INFO ContextLauncher: 16/05/23 17:49:03 WARN RSCConf: Your hostname, techlab01-rh.example.com, resolves to a loopback address, but we couldn't find any external IP address!
16/05/23 17:49:03 INFO ContextLauncher: 16/05/23 17:49:03 WARN RSCConf: Set livy.local.rpc.server.address if you need to bind to another address.
16/05/23 17:49:04 INFO ContextLauncher: 16/05/23 17:49:04 INFO RSCDriver: Received job request 622603ab-bb78-4b69-bdd6-52c852151a68
16/05/23 17:49:04 INFO ContextLauncher: 16/05/23 17:49:04 INFO RSCDriver: SparkContext not yet up, queueing job request.
16/05/23 17:49:05 INFO ContextLauncher: 16/05/23 17:49:05 INFO SecurityManager: Changing view acls to: livy,luijo
16/05/23 17:49:05 INFO ContextLauncher: 16/05/23 17:49:05 INFO SecurityManager: Changing modify acls to: livy,luijo
16/05/23 17:49:05 INFO ContextLauncher: 16/05/23 17:49:05 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(livy, luijo); users with modify permissions: Set(livy, luijo)
16/05/23 17:49:05 INFO ContextLauncher: 16/05/23 17:49:05 INFO HttpServer: Starting HTTP Server
16/05/23 17:49:05 INFO ContextLauncher: 16/05/23 17:49:05 INFO Utils: Successfully started service 'HTTP class server' on port 39810.
16/05/23 17:49:09 INFO ContextLauncher: 16/05/23 17:49:09 INFO SparkContext: Running Spark version 1.5.0-cdh5.6.0
16/05/23 17:49:09 INFO ContextLauncher: 16/05/23 17:49:09 INFO SecurityManager: Changing view acls to: livy,luijo
16/05/23 17:49:09 INFO ContextLauncher: 16/05/23 17:49:09 INFO SecurityManager: Changing modify acls to: livy,luijo
16/05/23 17:49:09 INFO ContextLauncher: 16/05/23 17:49:09 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(livy, luijo); users with modify permissions: Set(livy, luijo)
16/05/23 17:49:09 INFO ContextLauncher: 16/05/23 17:49:09 INFO Slf4jLogger: Slf4jLogger started
16/05/23 17:49:09 INFO ContextLauncher: 16/05/23 17:49:09 INFO Remoting: Starting remoting
16/05/23 17:49:10 INFO ContextLauncher: 16/05/23 17:49:10 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@techlab01-rh.example.com:55944]
16/05/23 17:49:10 INFO ContextLauncher: 16/05/23 17:49:10 INFO Remoting: Remoting now listens on addresses: [akka.tcp://sparkDriver@techlab01-rh.example.com:55944]
16/05/23 17:49:10 INFO ContextLauncher: 16/05/23 17:49:10 INFO Utils: Successfully started service 'sparkDriver' on port 55944.
16/05/23 17:49:10 INFO ContextLauncher: 16/05/23 17:49:10 INFO SparkEnv: Registering MapOutputTracker
16/05/23 17:49:10 INFO ContextLauncher: 16/05/23 17:49:10 INFO SparkEnv: Registering BlockManagerMaster
16/05/23 17:49:10 INFO ContextLauncher: 16/05/23 17:49:10 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-8423761d-fbb5-49d1-ae2e-47e1339335e1
16/05/23 17:49:10 INFO ContextLauncher: 16/05/23 17:49:10 INFO MemoryStore: MemoryStore started with capacity 530.3 MB
16/05/23 17:49:10 INFO ContextLauncher: 16/05/23 17:49:10 INFO HttpFileServer: HTTP File server directory is /tmp/spark-76b9ae2d-18b3-4c6b-ba2f-3a14376b14f0/httpd-16b4811f-cb06-49a7-a7cc-79e7161c518d
16/05/23 17:49:10 INFO ContextLauncher: 16/05/23 17:49:10 INFO HttpServer: Starting HTTP Server
16/05/23 17:49:10 INFO ContextLauncher: 16/05/23 17:49:10 INFO Utils: Successfully started service 'HTTP file server' on port 57323.
16/05/23 17:49:10 INFO ContextLauncher: 16/05/23 17:49:10 INFO SparkEnv: Registering OutputCommitCoordinator
16/05/23 17:49:10 INFO ContextLauncher: 16/05/23 17:49:10 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
16/05/23 17:49:10 INFO ContextLauncher: 16/05/23 17:49:10 INFO Utils: Successfully started service 'SparkUI' on port 4041.
16/05/23 17:49:10 INFO ContextLauncher: 16/05/23 17:49:10 INFO SparkUI: Started SparkUI at http://techlab01-rh.example.com:4041
16/05/23 17:49:10 INFO ContextLauncher: 16/05/23 17:49:10 INFO SparkContext: Added JAR file:/home/livy/livy/rsc/target/jars/netty-all-4.0.23.Final.jar at http://techlab01-rh.example.com:57323/jars/netty-all-4.0.23.Final.jar with timestamp 1463996950838
16/05/23 17:49:10 INFO ContextLauncher: 16/05/23 17:49:10 INFO SparkContext: Added JAR file:/home/livy/livy/rsc/target/jars/livy-api-0.2.0-SNAPSHOT.jar at http://techlab01-rh.example.com:57323/jars/livy-api-0.2.0-SNAPSHOT.jar with timestamp 1463996950839
16/05/23 17:49:10 INFO ContextLauncher: 16/05/23 17:49:10 INFO SparkContext: Added JAR file:/home/livy/livy/rsc/target/jars/livy-rsc-0.2.0-SNAPSHOT.jar at http://techlab01-rh.example.com:57323/jars/livy-rsc-0.2.0-SNAPSHOT.jar with timestamp 1463996950841
16/05/23 17:49:10 INFO ContextLauncher: 16/05/23 17:49:10 INFO SparkContext: Added JAR file:/home/livy/livy/repl/target/jars/commons-codec-1.9.jar at http://techlab01-rh.example.com:57323/jars/commons-codec-1.9.jar with timestamp 1463996950842
16/05/23 17:49:10 INFO ContextLauncher: 16/05/23 17:49:10 INFO SparkContext: Added JAR file:/home/livy/livy/repl/target/jars/livy-repl-0.2.0-SNAPSHOT.jar at http://techlab01-rh.example.com:57323/jars/livy-repl-0.2.0-SNAPSHOT.jar with timestamp 1463996950843
16/05/23 17:49:10 INFO ContextLauncher: 16/05/23 17:49:10 INFO SparkContext: Added JAR file:/home/livy/livy/repl/target/jars/livy-core-0.2.0-SNAPSHOT.jar at http://techlab01-rh.example.com:57323/jars/livy-core-0.2.0-SNAPSHOT.jar with timestamp 1463996950843
16/05/23 17:49:10 INFO ContextLauncher: 16/05/23 17:49:10 WARN MetricsSystem: Using default name DAGScheduler for source because spark.app.id is not set.
16/05/23 17:49:11 INFO ContextLauncher: 16/05/23 17:49:11 INFO Client: Requesting a new application from cluster with 11 NodeManagers
16/05/23 17:49:11 INFO ContextLauncher: 16/05/23 17:49:11 INFO Client: Verifying our application has not requested more than the maximum memory capability of the cluster (3800 MB per container)
16/05/23 17:49:11 INFO ContextLauncher: 16/05/23 17:49:11 INFO Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
16/05/23 17:49:11 INFO ContextLauncher: 16/05/23 17:49:11 INFO Client: Setting up container launch context for our AM
16/05/23 17:49:11 INFO ContextLauncher: 16/05/23 17:49:11 INFO Client: Setting up the launch environment for our AM container
16/05/23 17:49:11 INFO ContextLauncher: 16/05/23 17:49:11 INFO Client: Preparing resources for our AM container
16/05/23 17:49:11 INFO ContextLauncher: 16/05/23 17:49:11 INFO YarnSparkHadoopUtil: getting token for namenode: hdfs://nameservice1/user/luijo/.sparkStaging/application_1463992580024_0002
16/05/23 17:49:11 INFO ContextLauncher: 16/05/23 17:49:11 INFO DFSClient: Created HDFS_DELEGATION_TOKEN token 51130 for luijo on ha-hdfs:nameservice1
16/05/23 17:49:12 INFO ContextLauncher: 16/05/23 17:49:12 INFO metastore: Trying to connect to metastore with URI thrift://techlab02-rh.example.com:9083
16/05/23 17:49:12 INFO ContextLauncher: 16/05/23 17:49:12 ERROR TSaslTransport: SASL negotiation failure
16/05/23 17:49:12 INFO ContextLauncher: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]

I was using the account luijo to log in to Hue and run the notebook app.
The principal luijo@EXAMPLE.COM exists in the KDC as well.

Am I missing something in the configuration?

Thanks,
Johnson


Re: Livy Server Impersonation Problem

Do you have a valid Kerberos ticket for your user?

kinit ...
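A quick way to check is to inspect the ticket cache before starting Livy. A minimal sketch (the principal name and keytab path here are examples, not from your setup):

```shell
# Show the current ticket cache; a missing or expired TGT produces
# "Failed to find any Kerberos tgt" errors like the one above.
klist

# Re-obtain a TGT for the Livy service user from its keytab
# (principal and keytab path are placeholders).
kinit -kt /etc/security/keytabs/livy.keytab livy@EXAMPLE.COM

# Confirm the new ticket and its expiry time.
klist
```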

Re: Livy Server Impersonation Problem

New Contributor

Hello Romainr,

 

Thanks for your reply, and yes, I kinit as livy@EXAMPLE.COM before starting the Livy server.

I have also set the following in my core-site.xml:
<property>
   <name>hadoop.proxyuser.livy.groups</name>
   <value>*</value>
</property>
<property>
   <name>hadoop.proxyuser.livy.hosts</name>
   <value>*</value>
</property>
<property>
 <name>hadoop.proxyuser.hue.hosts</name>
 <value>*</value>
</property>
<property>
 <name>hadoop.proxyuser.hue.groups</name>
 <value>*</value>
</property>
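One thing worth ruling out: if those hadoop.proxyuser.* entries were added after the cluster was already up, the NameNode and ResourceManager may still be running with the old settings. They can be reloaded without a full restart (run as an HDFS/YARN superuser; adjust for your cluster):

```shell
# Push the updated hadoop.proxyuser.* settings to the NameNode.
hdfs dfsadmin -refreshSuperUserGroupsConfiguration

# Do the same for the ResourceManager.
yarn rmadmin -refreshSuperUserGroupsConfiguration
```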
 

I have even tried kinit as hue/techlab01-rh.example.com@EXAMPLE.COM before starting the Livy server,
but both attempts failed with the error I mentioned before.
 
Thanks,
Johnson

Re: Livy Server Impersonation Problem

New Contributor

Hi Johnson

 

I know this is an old post, but did you find a solution to this problem?  I have the same issue.

 

Thanks

Paul

Re: Livy Server Impersonation Problem

New Contributor

I got the same error (403) when using the Hue notebook through Livy on a Kerberized CDH 5.7.1 cluster.

I kinit the ticket before running livy-server, and I also changed the Hue code as in https://github.com/cloudera/hue/commit/25ed265ab719da51160f242ffe9a92cc6d8a4bec

It still does not work.

 

    Hue safety valve:

[desktop]
app_blacklist=sqoop,zookeeper,hbase
use_new_editor=true
[spark]
security_enabled=true
livy_server_host=192.168.103.166
livy_server_port=8998
livy_server_session_kind=yarn
livy_impersonation_enabled=true
languages='[{"name": "Scala Shell", "type": "spark"},{"name": "PySpark Shell", "type": "pyspark"},{"name": "R Shell", "type": "r"},{"name": "Jar", "type": "Jar"},{"name": "Python", "type": "py"},{"name": "Impala SQL", "type": "impala"},{"name": "Hive SQL", "type": "hive"},{"name": "Text", "type": "text"}]'

[notebook]
show_notebooks=true
enable_batch_execute=true
enable_query_builder=true
enable_query_scheduling=false
  [[interpreters]]
    [[[hive]]]
      name=Hive
      interface=hiveserver2
    [[[impala]]]
      name=Impala
      interface=hiveserver2
    [[[spark]]]
      name=Scala
      interface=livy
    [[[pyspark]]]
      name=PySpark
      interface=livy
    [[[jar]]]
      name=Spark Submit Jar
      interface=livy-batch
    [[[py]]]
      name=Spark Submit Python
      interface=livy-batch

    livy.conf:

livy.spark.master=yarn
livy.spark.deploy-mode=client
livy.superusers=hue
livy.server.auth.type=kerberos
livy.server.auth.kerberos.keytab=/etc/security/keytabs/spnego166.keytab
livy.server.auth.kerberos.principal=HTTP/bigdata166.xxxx.com@xxxx.COM
livy.server.launch.kerberos.keytab=/etc/security/keytabs/livy.keytab
livy.server.launch.kerberos.principal=livy/bigdata166.xxxx.com@xxxx.COM
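With SPNEGO auth enabled as above, a 403 often means the client side is not completing the negotiate handshake. After a successful kinit, the Livy REST endpoint can be probed directly to separate Livy problems from Hue problems (hostname taken from the config above):

```shell
# Ask Livy for the session list using SPNEGO.
# --negotiate makes curl perform the Kerberos handshake;
# "-u :" tells it to pull credentials from the ticket cache.
curl --negotiate -u : http://bigdata166.xxxx.com:8998/sessions
```

If this returns JSON but Hue still gets a 403, the problem is on the Hue side (e.g. the hue principal is not listed in livy.superusers); if curl itself gets a 401/403, the SPNEGO keytab or principal is the issue.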

    Livy log:

19/04/10 12:56:29 INFO AccessManager: AccessControlManager acls disabled;users with view permission: ;users with modify permission: ;users with super permission: hue;other allowed users: *
19/04/10 12:56:30 INFO LineBufferedStream: stdout: Welcome to
19/04/10 12:56:30 INFO LineBufferedStream: stdout:       ____              __
19/04/10 12:56:30 INFO LineBufferedStream: stdout:      / __/__  ___ _____/ /__
19/04/10 12:56:30 INFO LineBufferedStream: stdout:     _\ \/ _ \/ _ `/ __/  '_/
19/04/10 12:56:30 INFO LineBufferedStream: stdout:    /___/ .__/\_,_/_/ /_/\_\   version 1.6.0
19/04/10 12:56:30 INFO LineBufferedStream: stdout:       /_/
19/04/10 12:56:30 INFO LineBufferedStream: stdout:                         
19/04/10 12:56:30 INFO LineBufferedStream: stdout: Type --help for more information.
19/04/10 12:56:30 INFO StateStore$: Using BlackholeStateStore for recovery.
19/04/10 12:56:30 INFO BatchSessionManager: Recovered 0 batch sessions. Next session id: 0
19/04/10 12:56:30 INFO InteractiveSessionManager: Recovered 0 interactive sessions. Next session id: 0
19/04/10 12:56:30 INFO InteractiveSessionManager: Heartbeat watchdog thread started.
19/04/10 12:56:30 INFO LivyServer: SPNEGO auth enabled (principal = HTTP/bigdata166.xxxx.com@xxxx.COM)
19/04/10 12:56:30 INFO KerberosAuthenticationHandler: Login using keytab /etc/security/keytabs/spnego166.keytab, for principal HTTP/bigdata166.xxxx.com@xxxx.COM
19/04/10 12:56:30 INFO WebServer: Starting server on http://bigdata166.xxxx.com:8998
19/04/10 12:58:21 WARN AuthenticationFilter: Authentication exception: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos credentails)