Archives of Support Questions (Read Only)

This is an archived board for historical reference. Information and links may no longer be available or relevant. To ask a new question, please post a new topic on the appropriate active board.

./spark-shell doesn't start correctly

Rising Star

I have a single-node Hadoop setup configured, and also Hive, and I want to access Hive from Spark.

For Hadoop I configured core-site.xml and hdfs-site.xml.

For Hive I just extracted the archive and configured the environment variables in my .bashrc file.

For spark-1.6.1-bin-hadoop2.6 I did the same as for Hive: I extracted it and then configured the variables in the .bashrc file.

Now when I start spark-shell I get a lot of error messages, as you can see below.

Can you please help me understand why these errors are happening?

I added the two lines below to the spark-env.sh.template file, as Rich Raposa recommended, but the issue continues:

export SPARK_MASTER_IP=127.0.0.1
export SPARK_LOCAL_IP=127.0.0.1

Output with SPARK_LOCAL_IP=127.0.0.1 in the .bashrc file (only warnings; the shell starts):

[hdoopadmin@hadoop ~]$ spark-shell
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's repl log4j profile: org/apache/spark/log4j-defaults-repl.properties
To adjust logging level use sc.setLogLevel("INFO")
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.1
      /_/

Using Scala version 2.10.5 (OpenJDK 64-Bit Server VM, Java 1.8.0_77)
Type in expressions to have them evaluated.
Type :help for more information.
Spark context available as sc.
16/03/30 19:50:59 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
16/03/30 19:51:00 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
16/03/30 19:51:07 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
16/03/30 19:51:07 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
16/03/30 19:51:12 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
16/03/30 19:51:12 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
SQL context available as sqlContext.

scala> 

Full errors without SPARK_LOCAL_IP=127.0.0.1 in the .bashrc file:

[hdoopadmin@hadoop ~]$ spark-shell
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's repl log4j profile: org/apache/spark/log4j-defaults-repl.properties
To adjust logging level use sc.setLogLevel("INFO")
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.1
      /_/
Using Scala version 2.10.5 (OpenJDK 64-Bit Server VM, Java 1.8.0_77)
Type in expressions to have them evaluated.
Type :help for more information.
16/03/30 13:37:43 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
    ... (this warning repeats 16 times in total) ...
16/03/30 13:37:43 ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries!
   at sun.nio.ch.Net.bind0(Native Method)
   at sun.nio.ch.Net.bind(Net.java:433)
   at sun.nio.ch.Net.bind(Net.java:425)
   at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
   at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
   at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)
   at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485)
   at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089)
   at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430)
   at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415)
   at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903)
   at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198)
   at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348)
   at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
   at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
   at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
   at java.lang.Thread.run(Thread.java:745)
java.lang.NullPointerException
   at org.apache.spark.sql.SQLContext$.createListenerAndUI(SQLContext.scala:1367)
   at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101)
   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
   at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
   at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
   at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
   at $iwC$$iwC.<init>(<console>:15)
   at $iwC.<init>(<console>:24)
   at <init>(<console>:26)
   at .<init>(<console>:30)
   at .<clinit>(<console>)
   at .<init>(<console>:7)
   at .<clinit>(<console>)
   at $print(<console>)
   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   at java.lang.reflect.Method.invoke(Method.java:498)
   at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
   at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
   at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
   at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
   at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
   at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
   at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
   at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
   at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:132)
   at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
   at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
   at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
   at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
   at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
   at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
   at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
   at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
   at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
   at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
   at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
   at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
   at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
   at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
   at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
   at org.apache.spark.repl.Main$.main(Main.scala:31)
   at org.apache.spark.repl.Main.main(Main.scala)
   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   at java.lang.reflect.Method.invoke(Method.java:498)
   at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
   at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
   at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
   at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
   at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
<console>:16: error: not found: value sqlContext
  import sqlContext.implicits._
  ^
<console>:16: error: not found: value sqlContext
  import sqlContext.sql
  ^

scala>

1 ACCEPTED SOLUTION

Guru

@John Cod - Try adding the following environment variables to spark-env.sh (found in the conf folder of your Spark install), using the appropriate IP address, of course, if Spark is running on a machine other than localhost:

export SPARK_MASTER_IP=127.0.0.1
export SPARK_LOCAL_IP=127.0.0.1


4 REPLIES

Guru

@John Cod - Try adding the following environment variables to spark-env.sh (found in the conf folder of your Spark install), using the appropriate IP address, of course, if Spark is running on a machine other than localhost:

export SPARK_MASTER_IP=127.0.0.1
export SPARK_LOCAL_IP=127.0.0.1

Rising Star

Thanks for your answer. I tried your solution but it's not working; I get the same issue. I added those two lines to the spark-env.sh.template file but get the same errors!
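
A note on the step above: Spark only sources conf/spark-env.sh at launch, so anything added to spark-env.sh.template is ignored. Here is a minimal sketch of creating a real spark-env.sh from the template, assuming a standard tarball install with SPARK_HOME set:

# spark-env.sh.template is never read by Spark; copy it to spark-env.sh first
cd $SPARK_HOME/conf
cp spark-env.sh.template spark-env.sh

# then append the bind addresses (adjust the address for a non-local setup)
echo 'export SPARK_MASTER_IP=127.0.0.1' >> spark-env.sh
echo 'export SPARK_LOCAL_IP=127.0.0.1' >> spark-env.sh

After that, restart spark-shell so the new settings are picked up.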

Guru

Try using your hostname instead of the IP address. From the machine running the Spark master process, run:

$ hostname

Then use that value instead of 127.0.0.1.
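
A quick way to confirm that the hostname actually resolves on that machine, assuming a typical Linux setup (getent is an assumption here; any resolver check works):

# print the hostname the machine reports
hostname

# check that it resolves to an address; empty output means it does not
getent hosts "$(hostname)"

# if it does not resolve, map it in /etc/hosts, e.g. (hypothetical address):
# 192.168.1.10   hadoop

A hostname that does not resolve, or resolves to an address the machine does not own, is a common cause of the 'Cannot assign requested address' bind error above.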

New Member

I have a two-node configuration. I tried 127.0.0.1, the internal IP, and the hostname, with the same results as @John Cod. I added Spark through Ambari, so I'm not sure what other configuration it might be missing. No matter what I do:

16/07/27 14:30:30 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/07/27 14:30:30 ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries!
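
One way to narrow this down: in Spark 1.6, SPARK_LOCAL_IP is also read from the shell environment, so setting it inline for a single launch is a useful diagnostic (a sketch, not an Ambari-specific fix):

# override the driver bind address for this launch only;
# if this works while spark-env.sh does not, the env file is not being read
SPARK_LOCAL_IP=127.0.0.1 spark-shell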