09-30-2014 08:37 AM
Hello, I am running Ubuntu 14.04 64-bit with the Cloudera repository for CDH 5. I installed Hadoop YARN on one node (along with MongoDB and the Mongo-Hadoop connector). I am trying to install Spark with apt, but I cannot get it to work. Does anyone know what is happening and how to fix it?

I ran:

    sudo apt-get install spark-core spark-master spark-worker spark-history-server spark-python

but got:

    Setting up spark-master (1.0.0+cdh5.1.3+45-1.cdh5.1.3.p0.10~precise-cdh5.1.3) ...
     * Starting Spark master (spark-master):
    invoke-rc.d: initscript spark-master, action "start" failed.
    dpkg: error processing package spark-master (--configure):
     subprocess installed post-installation script returned error exit status 1
    Setting up spark-worker (1.0.0+cdh5.1.3+45-1.cdh5.1.3.p0.10~precise-cdh5.1.3) ...
     * Starting Spark worker (spark-worker):
    invoke-rc.d: initscript spark-worker, action "start" failed.
    dpkg: error processing package spark-worker (--configure):
     subprocess installed post-installation script returned error exit status 1
    Errors were encountered while processing:
     spark-master
     spark-worker
    E: Sub-process /usr/bin/dpkg returned an error code (1)

/var/log/spark/spark-worker.out contains:

    Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/impl/StaticLoggerBinder
        at org.apache.spark.Logging$class.initializeLogging(Logging.scala:114)
        at org.apache.spark.Logging$class.initializeIfNecessary(Logging.scala:106)
        at org.apache.spark.Logging$class.log(Logging.scala:45)
        at org.apache.spark.util.Utils$.log(Utils.scala:49)
        at org.apache.spark.Logging$class.logWarning(Logging.scala:70)
        at org.apache.spark.util.Utils$.logWarning(Utils.scala:49)
        at org.apache.spark.util.Utils$$anonfun$findLocalIpAddress$1$$anonfun$apply$2.apply(Utils.scala:476)
        at org.apache.spark.util.Utils$$anonfun$findLocalIpAddress$1$$anonfun$apply$2.apply(Utils.scala:473)
        at scala.collection.Iterator$class.foreach(Iterator.scala:727)
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
        at org.apache.spark.util.Utils$$anonfun$findLocalIpAddress$1.apply(Utils.scala:473)
        at org.apache.spark.util.Utils$$anonfun$findLocalIpAddress$1.apply(Utils.scala:472)
        at scala.collection.Iterator$class.foreach(Iterator.scala:727)
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
        at org.apache.spark.util.Utils$.findLocalIpAddress(Utils.scala:472)
        at org.apache.spark.util.Utils$.localIpAddress$lzycompute(Utils.scala:460)
        at org.apache.spark.util.Utils$.localIpAddress(Utils.scala:460)
        at org.apache.spark.util.Utils$.localIpAddressHostname$lzycompute(Utils.scala:461)
        at org.apache.spark.util.Utils$.localIpAddressHostname(Utils.scala:461)
        at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:508)
        at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:508)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.util.Utils$.localHostName(Utils.scala:508)
        at org.apache.spark.deploy.worker.WorkerArguments.<init>(WorkerArguments.scala:28)
        at org.apache.spark.deploy.worker.Worker$.main(Worker.scala:366)
        at org.apache.spark.deploy.worker.Worker.main(Worker.scala)
    Caused by: java.lang.ClassNotFoundException: org.slf4j.impl.StaticLoggerBinder
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        ... 26 more

I tried installing various slf4j and log4j libraries, but that did not fix it. I also tried playing with the classpath, with no success.
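The ClassNotFoundException for org.slf4j.impl.StaticLoggerBinder means no slf4j binding jar is visible on the worker's classpath when the init script launches it. A minimal sketch of one possible workaround: scan a lib directory for slf4j jars and prepend them to SPARK_CLASSPATH before the daemon starts. The directory /usr/lib/spark/lib and the use of SPARK_CLASSPATH in /etc/spark/conf/spark-env.sh are assumptions based on the CDH 5 package layout, so verify both on your node.

```shell
#!/bin/sh
# Sketch: collect slf4j jars from a lib directory into a colon-separated
# classpath fragment, so the worker can resolve
# org.slf4j.impl.StaticLoggerBinder at startup.
# NOTE: /usr/lib/spark/lib below is an assumed CDH 5 location, not verified.

slf4j_classpath() {
  # $1: directory to scan; prints a colon-terminated list of slf4j jars
  dir=$1
  cp=""
  for jar in "$dir"/slf4j-*.jar; do
    # an unmatched glob stays literal, so check the file really exists
    [ -e "$jar" ] && cp="$jar:$cp"
  done
  printf '%s\n' "$cp"
}

# e.g. in /etc/spark/conf/spark-env.sh (path assumed from the CDH layout):
SPARK_CLASSPATH="$(slf4j_classpath /usr/lib/spark/lib)$SPARK_CLASSPATH"
export SPARK_CLASSPATH
```

After adjusting the classpath, re-running `sudo dpkg --configure -a` lets the interrupted post-installation scripts try to start the services again.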