more livy-livy-server.out
18/06/25 07:44:38 WARN LivyConf: The configuration key livy.repl.enableHiveContext has been deprecated as of Livy 0.4 and may be removed in the future. Please use the new key livy.repl.enable-hive-context instead.
18/06/25 07:44:38 WARN LivyConf: The configuration key livy.server.csrf_protection.enabled has been deprecated as of Livy 0.4 and may be removed in the future. Please use the new key livy.server.csrf-protection.enabled instead.
18/06/25 07:44:38 INFO AccessManager: AccessControlManager acls disabled;users with view permission: ;users with modify permission: ;users with super permission: ;other allowed users: *
18/06/25 07:44:39 INFO LineBufferedStream: stdout: Welcome to
18/06/25 07:44:39 INFO LineBufferedStream: stdout:       ____              __
18/06/25 07:44:39 INFO LineBufferedStream: stdout:      / __/__  ___ _____/ /__
18/06/25 07:44:39 INFO LineBufferedStream: stdout:     _\ \/ _ \/ _ `/ __/ '_/
18/06/25 07:44:39 INFO LineBufferedStream: stdout:    /___/ .__/\_,_/_/ /_/\_\   version 2.2.0.2.6.3.0-235
18/06/25 07:44:39 INFO LineBufferedStream: stdout:       /_/
18/06/25 07:44:39 INFO LineBufferedStream: stdout:
18/06/25 07:44:39 INFO LineBufferedStream: stdout: Using Scala version 2.11.8, Java HotSpot(TM) 64-Bit Server VM, 1.8.0_112
18/06/25 07:44:39 INFO LineBufferedStream: stdout: Branch HEAD
18/06/25 07:44:39 INFO LineBufferedStream: stdout: Compiled by user jenkins on 2017-10-30T02:50:38Z
18/06/25 07:44:39 INFO LineBufferedStream: stdout: Revision 4842c452e9a0ae6543f3749aec7bff92e82434e1
18/06/25 07:44:39 INFO LineBufferedStream: stdout: Url git@github.com:hortonworks/spark2.git
18/06/25 07:44:39 INFO LineBufferedStream: stdout: Type --help for more information.
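The two LivyConf warnings above name the deprecated keys and their Livy 0.4 replacements. A minimal sketch of the corresponding rename in livy.conf (the `true` values are assumptions for illustration; keep whatever values the deployment actually uses):

```properties
# livy.conf -- replace the pre-0.4 key names with the ones the warnings suggest:
#   livy.repl.enableHiveContext          -> livy.repl.enable-hive-context
#   livy.server.csrf_protection.enabled  -> livy.server.csrf-protection.enabled
livy.repl.enable-hive-context = true
livy.server.csrf-protection.enabled = true
```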
18/06/25 07:44:39 DEBUG MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)])
18/06/25 07:44:39 DEBUG MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)])
18/06/25 07:44:39 DEBUG MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[GetGroups])
18/06/25 07:44:39 DEBUG MutableMetricsFactory: field private org.apache.hadoop.metrics2.lib.MutableGaugeLong org.apache.hadoop.security.UserGroupInformation$UgiMetrics.renewalFailuresTotal with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Renewal failures since startup])
18/06/25 07:44:39 DEBUG MutableMetricsFactory: field private org.apache.hadoop.metrics2.lib.MutableGaugeInt org.apache.hadoop.security.UserGroupInformation$UgiMetrics.renewalFailures with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Renewal failures since last successful login])
18/06/25 07:44:39 DEBUG MetricsSystemImpl: UgiMetrics, User and group related metrics
18/06/25 07:44:40 DEBUG SecurityUtil: Setting hadoop.security.token.service.use_ip to true
18/06/25 07:44:40 DEBUG Shell: Failed to detect a valid hadoop home directory
java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset.
        at org.apache.hadoop.util.Shell.checkHadoopHomeInner(Shell.java:429)
        at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:400)
        at org.apache.hadoop.util.Shell.<clinit>(Shell.java:477)
        at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:79)
        at org.apache.hadoop.conf.Configuration.getBoolean(Configuration.java:1458)
        at org.apache.hadoop.security.SecurityUtil.setConfigurationInternal(SecurityUtil.java:96)
        at org.apache.hadoop.security.SecurityUtil.<clinit>(SecurityUtil.java:80)
        at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:303)
        at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:291)
        at org.apache.hadoop.security.UserGroupInformation.isAuthenticationMethodEnabled(UserGroupInformation.java:378)
        at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:372)
        at org.apache.livy.server.LivyServer.start(LivyServer.scala:93)
        at org.apache.livy.server.LivyServer$.main(LivyServer.scala:339)
        at org.apache.livy.server.LivyServer.main(LivyServer.scala)
18/06/25 07:44:40 DEBUG Shell: setsid exited with exit code 0
18/06/25 07:44:40 DEBUG KerberosName: Kerberos krb5 configuration not found, setting default realm to empty
18/06/25 07:44:40 DEBUG Groups: Creating new Groups object
18/06/25 07:44:40 DEBUG NativeCodeLoader: Trying to load the custom-built native-hadoop library...
18/06/25 07:44:40 DEBUG NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
18/06/25 07:44:40 DEBUG NativeCodeLoader: java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
18/06/25 07:44:40 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/06/25 07:44:40 DEBUG PerformanceAdvisory: Falling back to shell based
18/06/25 07:44:40 DEBUG JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
18/06/25 07:44:40 DEBUG Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
18/06/25 07:44:40 DEBUG AbstractService: Service: org.apache.hadoop.yarn.client.api.impl.YarnClientImpl entered state INITED
18/06/25 07:44:40 DEBUG RMProxy: Created non-failover retry policy: RetryUpToMaximumCountWithFixedSleep(maxRetries=2147483647, sleepTime=15000 MILLISECONDS)
18/06/25 07:44:40 INFO RMProxy: Connecting to ResourceManager at lrandom_host.net/40.133.133.214:8050
18/06/25 07:44:40 DEBUG UserGroupInformation: hadoop login
18/06/25 07:44:40 DEBUG UserGroupInformation: hadoop login commit
18/06/25 07:44:40 DEBUG UserGroupInformation: using local user:UnixPrincipal: livy
18/06/25 07:44:40 DEBUG UserGroupInformation: Using user: "UnixPrincipal: livy" with name livy
18/06/25 07:44:40 DEBUG UserGroupInformation: User entry: "livy"
18/06/25 07:44:40 DEBUG UserGroupInformation: Assuming keytab is managed externally since logged in from subject.
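The `HADOOP_HOME and hadoop.home.dir are unset` exception above and the `Unable to load native-hadoop library` warning are both environment issues rather than Livy bugs. A minimal sketch of the environment to set before starting the server, assuming an HDP-style layout (`/usr/hdp/current/hadoop-client` is an illustrative path, not taken from this log):

```shell
# Hadoop's Shell utility resolves HADOOP_HOME (or -Dhadoop.home.dir) to find the
# install; the JVM resolves libhadoop.so via the library path, falling back to
# builtin-java classes when it is missing. Adjust the path to the real cluster layout.
export HADOOP_HOME=/usr/hdp/current/hadoop-client
export LD_LIBRARY_PATH="$HADOOP_HOME/lib/native${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
```

Both messages are non-fatal here, so this is an optional cleanup: the server continues with builtin-java implementations, at the cost of features such as short-circuit local reads (see the DomainSocketFactory warning further down).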
18/06/25 07:44:40 DEBUG UserGroupInformation: UGI loginUser:livy (auth:SIMPLE)
18/06/25 07:44:40 DEBUG UserGroupInformation: PrivilegedAction as:livy (auth:SIMPLE) from:org.apache.hadoop.yarn.client.RMProxy.getProxy(RMProxy.java:163)
18/06/25 07:44:40 DEBUG YarnRPC: Creating YarnRPC for org.apache.hadoop.yarn.ipc.HadoopYarnProtoRPC
18/06/25 07:44:40 DEBUG HadoopYarnProtoRPC: Creating a HadoopYarnProtoRpc proxy for protocol interface org.apache.hadoop.yarn.api.ApplicationClientProtocol
18/06/25 07:44:40 DEBUG Server: rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@12587fdc
18/06/25 07:44:40 DEBUG Client: getting client out of cache: org.apache.hadoop.ipc.Client@49a81f5
18/06/25 07:44:40 DEBUG UserGroupInformation: PrivilegedAction as:livy (auth:SIMPLE) from:org.apache.hadoop.fs.FileContext.getAbstractFileSystem(FileContext.java:333)
18/06/25 07:44:40 DEBUG AbstractService: Service org.apache.hadoop.yarn.client.api.impl.YarnClientImpl is started
18/06/25 07:44:40 DEBUG BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false
18/06/25 07:44:40 DEBUG BlockReaderLocal: dfs.client.read.shortcircuit = true
18/06/25 07:44:40 DEBUG BlockReaderLocal: dfs.client.domain.socket.data.traffic = false
18/06/25 07:44:40 DEBUG BlockReaderLocal: dfs.domain.socket.path = /var/lib/hadoop-hdfs/dn_socket
18/06/25 07:44:40 DEBUG HAUtil: No HA service delegation token found for logical URI hdfs://ltrkarkdev
18/06/25 07:44:40 DEBUG BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false
18/06/25 07:44:40 DEBUG BlockReaderLocal: dfs.client.read.shortcircuit = true
18/06/25 07:44:40 DEBUG BlockReaderLocal: dfs.client.domain.socket.data.traffic = false
18/06/25 07:44:40 DEBUG BlockReaderLocal: dfs.domain.socket.path = /var/lib/hadoop-hdfs/dn_socket
18/06/25 07:44:40 DEBUG RetryUtils: multipleLinearRandomRetry = null
18/06/25 07:44:40 DEBUG Client: getting client out of cache: org.apache.hadoop.ipc.Client@49a81f5
18/06/25 07:44:41 WARN DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
18/06/25 07:44:41 DEBUG DataTransferSaslUtil: DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection
18/06/25 07:44:41 DEBUG DFSClient: /livy2-recovery: masked=rwx------
18/06/25 07:44:41 DEBUG Client: The ping interval is 60000 ms.
18/06/25 07:44:41 DEBUG Client: Connecting to random_host.net:8020
18/06/25 07:44:41 DEBUG Client: IPC Client (686956605) connection to random_host.net:8020 from livy: starting, having connections 1
18/06/25 07:44:41 DEBUG Client: IPC Client (686956605) connection to random_host.net:8020 from livy sending #0
18/06/25 07:44:41 DEBUG Client: IPC Client (686956605) connection to random_host.net:8020 from livy got value #0
18/06/25 07:44:41 DEBUG ProtobufRpcEngine: Call: mkdirs took 80ms
18/06/25 07:44:41 DEBUG Client: IPC Client (686956605) connection to random_host.net:8020 from livy sending #1
18/06/25 07:44:41 DEBUG Client: IPC Client (686956605) connection to random_host.net:8020 from livy got value #1
18/06/25 07:44:41 DEBUG ProtobufRpcEngine: Call: getFileInfo took 1ms
18/06/25 07:44:41 INFO StateStore$: Using FileSystemStateStore for recovery.
18/06/25 07:44:41 DEBUG Client: IPC Client (686956605) connection to random_host.net:8020 from livy sending #2
18/06/25 07:44:41 DEBUG Client: IPC Client (686956605) connection to random_host.net:8020 from livy got value #2
18/06/25 07:44:41 DEBUG ProtobufRpcEngine: Call: getServerDefaults took 2ms
18/06/25 07:44:41 DEBUG Client: IPC Client (686956605) connection to random_host.net:8020 from livy sending #3
18/06/25 07:44:41 DEBUG Client: IPC Client (686956605) connection to random_host.net:8020 from livy got value #3
18/06/25 07:44:41 DEBUG RetryInvocationHandler: Exception while invoking ClientNamenodeProtocolTranslatorPB.getBlockLocations over random_host.net:8020. Not retrying because try once and fail.
org.apache.hadoop.ipc.RemoteException(java.io.FileNotFoundException): File does not exist: /livy2-recovery/v1/batch/state
        at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:71)
        at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:61)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocationsInt(FSNamesystem.java:2025)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1996)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1909)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getBlockLocations(NameNodeRpcServer.java:700)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getBlockLocations(ClientNamenodeProtocolServerSideTranslatorPB.java:377)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2351)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2347)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2347)
        at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1554)
        at org.apache.hadoop.ipc.Client.call(Client.java:1498)
        at org.apache.hadoop.ipc.Client.call(Client.java:1398)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
        at com.sun.proxy.$Proxy13.getBlockLocations(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getBlockLocations(ClientNamenodeProtocolTranslatorPB.java:272)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:291)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:203)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:185)
        at com.sun.proxy.$Proxy14.getBlockLocations(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient.callGetBlockLocations(DFSClient.java:1238)
        at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1225)
        at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1213)
        at org.apache.hadoop.hdfs.DFSInputStream.fetchLocatedBlocksAndGetLastBlockLength(DFSInputStream.java:309)
        at org.apache.hadoop.hdfs.DFSInputStream.openInfo(DFSInputStream.java:274)
        at org.apache.hadoop.hdfs.DFSInputStream.<init>(DFSInputStream.java:266)
        at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1538)
        at org.apache.hadoop.fs.Hdfs.open(Hdfs.java:317)
        at org.apache.hadoop.fs.Hdfs.open(Hdfs.java:60)
        at org.apache.hadoop.fs.AbstractFileSystem.open(AbstractFileSystem.java:629)
        at org.apache.hadoop.fs.FileContext$6.next(FileContext.java:797)
        at org.apache.hadoop.fs.FileContext$6.next(FileContext.java:793)
        at org.apache.hadoop.fs.FSLinkResolver.resolve(FSLinkResolver.java:90)
        at org.apache.hadoop.fs.FileContext.open(FileContext.java:799)
        at org.apache.livy.server.recovery.FileSystemStateStore.get(FileSystemStateStore.scala:105)
        at org.apache.livy.server.recovery.SessionStore.getNextSessionId(SessionStore.scala:77)
        at org.apache.livy.sessions.SessionManager.org$apache$livy$sessions$SessionManager$$recover(SessionManager.scala:157)
        at org.apache.livy.sessions.SessionManager$$anonfun$1.apply(SessionManager.scala:83)
        at org.apache.livy.sessions.SessionManager$$anonfun$1.apply(SessionManager.scala:83)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.livy.sessions.SessionManager.<init>(SessionManager.scala:83)
        at org.apache.livy.sessions.BatchSessionManager.<init>(SessionManager.scala:40)
        at org.apache.livy.server.LivyServer.start(LivyServer.scala:130)
        at org.apache.livy.server.LivyServer$.main(LivyServer.scala:339)
        at org.apache.livy.server.LivyServer.main(LivyServer.scala)
18/06/25 07:44:41 DEBUG Client: IPC Client (686956605) connection to random_host.net:8020 from livy sending #4
18/06/25 07:44:41 DEBUG Client: IPC Client (686956605) connection to random_host.net:8020 from livy got value #4
18/06/25 07:44:41 DEBUG ProtobufRpcEngine: Call: getListing took 2ms
18/06/25 07:44:41 INFO BatchSessionManager: Recovered 0 batch sessions. Next session id: 0
18/06/25 07:44:41 DEBUG Client: IPC Client (686956605) connection to random_host.net:8020 from livy sending #5
18/06/25 07:44:41 DEBUG Client: IPC Client (686956605) connection to random_host.net:8020 from livy got value #5
18/06/25 07:44:41 DEBUG RetryInvocationHandler: Exception while invoking ClientNamenodeProtocolTranslatorPB.getBlockLocations over random_host.net:8020. Not retrying because try once and fail.
org.apache.hadoop.ipc.RemoteException(java.io.FileNotFoundException): File does not exist: /livy2-recovery/v1/interactive/state
        at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:71)
        at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:61)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocationsInt(FSNamesystem.java:2025)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1996)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1909)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getBlockLocations(NameNodeRpcServer.java:700)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getBlockLocations(ClientNamenodeProtocolServerSideTranslatorPB.java:377)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2351)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2347)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2347)
        at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1554)
        at org.apache.hadoop.ipc.Client.call(Client.java:1498)
        at org.apache.hadoop.ipc.Client.call(Client.java:1398)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
        at com.sun.proxy.$Proxy13.getBlockLocations(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getBlockLocations(ClientNamenodeProtocolTranslatorPB.java:272)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:291)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:203)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:185)
        at com.sun.proxy.$Proxy14.getBlockLocations(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient.callGetBlockLocations(DFSClient.java:1238)
        at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1225)
        at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1213)
        at org.apache.hadoop.hdfs.DFSInputStream.fetchLocatedBlocksAndGetLastBlockLength(DFSInputStream.java:309)
        at org.apache.hadoop.hdfs.DFSInputStream.openInfo(DFSInputStream.java:274)
        at org.apache.hadoop.hdfs.DFSInputStream.<init>(DFSInputStream.java:266)
        at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1538)
        at org.apache.hadoop.fs.Hdfs.open(Hdfs.java:317)
        at org.apache.hadoop.fs.Hdfs.open(Hdfs.java:60)
        at org.apache.hadoop.fs.AbstractFileSystem.open(AbstractFileSystem.java:629)
        at org.apache.hadoop.fs.FileContext$6.next(FileContext.java:797)
        at org.apache.hadoop.fs.FileContext$6.next(FileContext.java:793)
        at org.apache.hadoop.fs.FSLinkResolver.resolve(FSLinkResolver.java:90)
        at org.apache.hadoop.fs.FileContext.open(FileContext.java:799)
        at org.apache.livy.server.recovery.FileSystemStateStore.get(FileSystemStateStore.scala:105)
        at org.apache.livy.server.recovery.SessionStore.getNextSessionId(SessionStore.scala:77)
        at org.apache.livy.sessions.SessionManager.org$apache$livy$sessions$SessionManager$$recover(SessionManager.scala:157)
        at org.apache.livy.sessions.SessionManager$$anonfun$1.apply(SessionManager.scala:83)
        at org.apache.livy.sessions.SessionManager$$anonfun$1.apply(SessionManager.scala:83)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.livy.sessions.SessionManager.<init>(SessionManager.scala:83)
        at org.apache.livy.sessions.InteractiveSessionManager.<init>(SessionManager.scala:47)
        at org.apache.livy.server.LivyServer.start(LivyServer.scala:131)
        at org.apache.livy.server.LivyServer$.main(LivyServer.scala:339)
        at org.apache.livy.server.LivyServer.main(LivyServer.scala)
18/06/25 07:44:41 DEBUG Client: IPC Client (686956605) connection to random_host.net:8020 from livy sending #6
18/06/25 07:44:41 DEBUG Client: IPC Client (686956605) connection to random_host.net:8020 from livy got value #6
18/06/25 07:44:41 DEBUG ProtobufRpcEngine: Call: getListing took 1ms
18/06/25 07:44:41 INFO InteractiveSessionManager: Recovered 0 interactive sessions. Next session id: 0
18/06/25 07:44:41 INFO InteractiveSessionManager: Heartbeat watchdog thread started.
18/06/25 07:44:41 INFO LivyServer: CSRF protection is enabled.
18/06/25 07:44:41 INFO WebServer: Starting server on http://lrandom_host.net:8999
18/06/25 07:45:11 DEBUG Client: IPC Client (686956605) connection to random_host.net:8020 from livy: closed
18/06/25 07:45:11 DEBUG Client: IPC Client (686956605) connection to random_host.net:8020 from livy: stopped, remaining connections 0
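The two `File does not exist: /livy2-recovery/v1/.../state` traces are expected on a first start with recovery enabled: the FileSystemStateStore probes HDFS for previously saved session state, finds none, and the server continues normally (`Recovered 0 batch sessions`, `Recovered 0 interactive sessions`). For reference, a recovery setup of this shape would come from livy.conf entries like the following sketch; the store URL is inferred from the `/livy2-recovery` paths in the traces, so treat it as an assumption:

```properties
# livy.conf -- session recovery backed by HDFS via the FileSystemStateStore.
livy.server.recovery.mode = recovery
livy.server.recovery.state-store = filesystem
livy.server.recovery.state-store.url = hdfs:///livy2-recovery
```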