
How to create a correct spark yarn-client interpreter in Zeppelin (Spark 1.5.2, HDP 2.3.4, Zeppelin 0.6.0)

New Contributor

my spark-yarn-client interpreter

[screenshot: spark-yarn-client interpreter settings]

run sc.version

[screenshot: result of running sc.version]

my zeppelin-env

[screenshot: zeppelin-env configuration]

zeppelin-zeppelin-master.easted.out in /var/log/zeppelin

[screenshot: zeppelin-zeppelin-master.easted.out log]

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/zeppelin/interpreter/spark/dep/zeppelin-spark-dependencies-0.6.0-incubating-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/zeppelin/interpreter/spark/zeppelin-spark-0.6.0-incubating-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/zeppelin/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
------ Create new SparkContext local[*] -------
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512m; support was removed in 8.0
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/zeppelin/interpreter/spark/dep/zeppelin-spark-dependencies-0.6.0-incubating-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/zeppelin/interpreter/spark/zeppelin-spark-0.6.0-incubating-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/zeppelin/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
------ Create new SparkContext yarn-client -------
------ Create new SparkContext yarn-client -------
Exception in thread "Thread-60" org.apache.zeppelin.interpreter.InterpreterException: org.apache.thrift.transport.TTransportException
	at org.apache.zeppelin.interpreter.remote.RemoteInterpreter.close(RemoteInterpreter.java:178)
	at org.apache.zeppelin.interpreter.LazyOpenInterpreter.close(LazyOpenInterpreter.java:78)
	at org.apache.zeppelin.interpreter.InterpreterGroup$1.run(InterpreterGroup.java:81)
Caused by: org.apache.thrift.transport.TTransportException
	at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
	at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
	at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:429)
	at org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:318)
	at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:219)
	at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:69)
	at org.apache.zeppelin.interpreter.thrift.RemoteInterpreterService$Client.recv_close(RemoteInterpreterService.java:198)
	at org.apache.zeppelin.interpreter.thrift.RemoteInterpreterService$Client.close(RemoteInterpreterService.java:185)
	at org.apache.zeppelin.interpreter.remote.RemoteInterpreter.close(RemoteInterpreter.java:176)
	... 2 more

1 ACCEPTED SOLUTION

Expert Contributor

Does it work from the spark-shell?

I would explicitly define SPARK_HOME in the zeppelin_env_content (export SPARK_HOME=/usr/hdp/current/spark-client)

You could also try "yarn-cluster" on the interpreter settings screen.
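As a rough sketch of the suggestion above (paths are the stock HDP 2.3 defaults and may differ on your cluster), you could first confirm the Spark client works against YARN outside Zeppelin, then pin SPARK_HOME in zeppelin-env.sh:

```shell
# 1. Sanity-check the Spark client against YARN outside Zeppelin.
#    If this fails too, the problem is in Spark/YARN, not Zeppelin.
/usr/hdp/current/spark-client/bin/spark-shell --master yarn-client

# 2. If the shell works, make Zeppelin use the same Spark install by
#    adding these lines to conf/zeppelin-env.sh (zeppelin_env_content
#    in Ambari), then restart the Zeppelin daemon:
export SPARK_HOME=/usr/hdp/current/spark-client
export HADOOP_CONF_DIR=/etc/hadoop/conf
```

Setting HADOOP_CONF_DIR explicitly ensures the interpreter picks up the cluster's yarn-site.xml rather than falling back to a local default.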


5 REPLIES

New Contributor

I think the problem may be caused by YARN; check the YARN configuration items.
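If you suspect YARN, a minimal diagnostic sketch might look like the following; the commands are standard YARN CLI calls, and the config path assumes an HDP-style layout under /etc/hadoop/conf:

```shell
# Confirm NodeManagers are up and no Zeppelin apps are stuck:
yarn node -list
yarn application -list

# Inspect the container memory limits that often kill a yarn-client
# driver at startup (property names are standard yarn-site.xml keys):
grep -A1 'yarn.nodemanager.resource.memory-mb' /etc/hadoop/conf/yarn-site.xml
grep -A1 'yarn.scheduler.maximum-allocation-mb' /etc/hadoop/conf/yarn-site.xml
```

If the allocation maximum is smaller than what the Spark driver/executors request, the YARN application dies immediately and Zeppelin only sees the Thrift connection drop.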

Expert Contributor

Does it work from the spark-shell?

I would explicitly define SPARK_HOME in the zeppelin_env_content (export SPARK_HOME=/usr/hdp/current/spark-client)

You could also try "yarn-cluster" on the interpreter settings screen.

Expert Contributor

Ignore my comment on "yarn-cluster"; that only applies to the Livy server.

New Contributor

Thanks for your answer; I have resolved the problem.

New Contributor

How did you solve this? I am facing a similar problem.