Created 02-06-2018 05:04 AM
[root@ conf]# spark-shell --master yarn --conf spark.ui.port=12234
18/01/31 09:58:55 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/01/31 09:58:56 INFO SecurityManager: Changing view acls to: root
18/01/31 09:58:56 INFO SecurityManager: Changing modify acls to: root
18/01/31 09:58:56 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
18/01/31 09:58:56 INFO HttpServer: Starting HTTP Server
18/01/31 09:58:57 INFO Server: jetty-8.y.z-SNAPSHOT
18/01/31 09:58:57 INFO AbstractConnector: Started SocketConnector@0.0.0.0:43155
18/01/31 09:58:57 INFO Utils: Successfully started service 'HTTP class server' on port 43155.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.2
      /_/
. . .
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
        at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
        at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
        at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
        at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
        at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
        at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
        at org.apache.spark.repl.Main$.main(Main.scala:31)
        at org.apache.spark.repl.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
<console>:16: error: not found: value sqlContext
       import sqlContext.implicits._
              ^
<console>:16: error: not found: value sqlContext
       import sqlContext.sql
              ^

scala>
Created 02-06-2018 07:31 AM
@zkfs Is there anything in the log before this line?
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
Created 02-13-2018 01:04 AM
When launching spark-shell, I get the error below:

[root@centos4 ~]# spark-shell --master yarn \
>     --deploy-mode client \
>     --conf spark.ui.port=12335 \
>     --num-executors 1 \
>     --executor-memory 512M
18/02/14 18:58:58 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/02/14 18:58:59 INFO SecurityManager: Changing view acls to: root
18/02/14 18:58:59 INFO SecurityManager: Changing modify acls to: root
18/02/14 18:58:59 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
18/02/14 18:59:00 INFO HttpServer: Starting HTTP Server
18/02/14 18:59:00 INFO Server: jetty-8.y.z-SNAPSHOT
18/02/14 18:59:01 INFO AbstractConnector: Started SocketConnector@0.0.0.0:39940
18/02/14 18:59:01 INFO Utils: Successfully started service 'HTTP class server' on port 39940.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.2
      /_/

Using Scala version 2.10.5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_60)
Type in expressions to have them evaluated.
Type :help for more information.
18/02/14 18:59:17 INFO SparkContext: Running Spark version 1.6.2
18/02/14 18:59:17 INFO SecurityManager: Changing view acls to: root
18/02/14 18:59:17 INFO SecurityManager: Changing modify acls to: root
18/02/14 18:59:17 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
18/02/14 18:59:19 INFO Utils: Successfully started service 'sparkDriver' on port 45694.
18/02/14 18:59:23 INFO Slf4jLogger: Slf4jLogger started
18/02/14 18:59:24 INFO Remoting: Starting remoting
18/02/14 18:59:25 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@192.168.154.114:43865]
18/02/14 18:59:25 INFO Utils: Successfully started service 'sparkDriverActorSystem' on port 43865.
18/02/14 18:59:25 INFO SparkEnv: Registering MapOutputTracker
18/02/14 18:59:25 INFO SparkEnv: Registering BlockManagerMaster
18/02/14 18:59:25 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-acbf69ba-bf4b-4fae-9d28-ae78d9b60aca
18/02/14 18:59:25 INFO MemoryStore: MemoryStore started with capacity 511.1 MB
18/02/14 18:59:26 INFO SparkEnv: Registering OutputCommitCoordinator
18/02/14 18:59:26 INFO Server: jetty-8.y.z-SNAPSHOT
18/02/14 18:59:26 INFO AbstractConnector: Started SelectChannelConnector@0.0.0.0:12335
18/02/14 18:59:26 INFO Utils: Successfully started service 'SparkUI' on port 12335.
18/02/14 18:59:26 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.154.114:12335
spark.yarn.driver.memoryOverhead is set but does not apply in client mode.
18/02/14 18:59:28 INFO TimelineClientImpl: Timeline service address: http://centos1.test.com:8188/ws/v1/timeline/
18/02/14 18:59:29 INFO RMProxy: Connecting to ResourceManager at centos1.test.com/192.168.154.112:8050
18/02/14 18:59:31 WARN DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
18/02/14 18:59:31 INFO Client: Requesting a new application from cluster with 2 NodeManagers
18/02/14 18:59:31 INFO Client: Verifying our application has not requested more than the maximum memory capability of the cluster (1024 MB per container)
18/02/14 18:59:31 INFO Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
18/02/14 18:59:31 INFO Client: Setting up container launch context for our AM
18/02/14 18:59:31 INFO Client: Setting up the launch environment for our AM container
18/02/14 18:59:31 INFO Client: Using the spark assembly jar on HDFS because you are using HDP, defaultSparkAssembly:hdfs://centos.test.com:8020/hdp/apps/2.4.3.0-227/spark/spark-hdp-assembly.jar
18/02/14 18:59:31 INFO Client: Preparing resources for our AM container
18/02/14 18:59:31 INFO Client: Using the spark assembly jar on HDFS because you are using HDP, defaultSparkAssembly:hdfs://centos.test.com:8020/hdp/apps/2.4.3.0-227/spark/spark-hdp-assembly.jar
18/02/14 18:59:31 INFO Client: Source and destination file systems are the same. Not copying hdfs://centos.test.com:8020/hdp/apps/2.4.3.0-227/spark/spark-hdp-assembly.jar
18/02/14 18:59:31 INFO Client: Uploading resource file:/tmp/spark-611a1716-891c-4b5b-84aa-3eeebb204084/__spark_conf__3734921673150808950.zip -> hdfs://centos.test.com:8020/user/root/.sparkStaging/application_1518599648055_0002/__spark_conf__3734921673150808950.zip
18/02/14 18:59:31 INFO SecurityManager: Changing view acls to: root
18/02/14 18:59:31 INFO SecurityManager: Changing modify acls to: root
18/02/14 18:59:31 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
18/02/14 18:59:31 INFO Client: Submitting application 2 to ResourceManager
18/02/14 18:59:32 INFO YarnClientImpl: Submitted application application_1518599648055_0002
18/02/14 18:59:32 INFO SchedulerExtensionServices: Starting Yarn extension services with app application_1518599648055_0002 and attemptId None
18/02/14 18:59:33 INFO Client: Application report for application_1518599648055_0002 (state: ACCEPTED)
18/02/14 18:59:33 INFO Client:
     client token: N/A
     diagnostics: N/A
     ApplicationMaster host: N/A
     ApplicationMaster RPC port: -1
     queue: default
     start time: 1518600178504
     final status: UNDEFINED
     tracking URL: http://centos1.test.com:8088/proxy/application_1518599648055_0002/
     user: root
18/02/14 18:59:34 INFO Client: Application report for application_1518599648055_0002 (state: ACCEPTED)
18/02/14 18:59:35 INFO Client: Application report for application_1518599648055_0002 (state: FAILED)
18/02/14 18:59:35 INFO Client:
     client token: N/A
     diagnostics: Application application_1518599648055_0002 failed 2 times due to Error launching appattempt_1518599648055_0002_000002. Got exception: org.apache.hadoop.yarn.exceptions.YarnException: Unauthorized request to start container.
This token is expired. current time is 1518614975052 found 1518600781196
Note: System times on machines may be out of sync. Check system time and time zones.
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
        at org.apache.hadoop.yarn.api.records.impl.pb.SerializedExceptionPBImpl.instantiateException(SerializedExceptionPBImpl.java:168)
        at org.apache.hadoop.yarn.api.records.impl.pb.SerializedExceptionPBImpl.deSerialize(SerializedExceptionPBImpl.java:106)
        at org.apache.hadoop.yarn.server.resourcemanager.amlauncher.AMLauncher.launch(AMLauncher.java:122)
        at org.apache.hadoop.yarn.server.resourcemanager.amlauncher.AMLauncher.run(AMLauncher.java:250)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
. Failing the application.
     ApplicationMaster host: N/A
     ApplicationMaster RPC port: -1
     queue: default
     start time: 1518600178504
     final status: FAILED
     tracking URL: http://centos1.test.com:8088/cluster/app/application_1518599648055_0002
     user: root
18/02/14 18:59:35 INFO Client: Deleting staging directory .sparkStaging/application_1518599648055_0002
18/02/14 18:59:35 ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
        at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:122)
        at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:62)
        at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:144)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:530)
        at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
        at $line3.$read$$iwC$$iwC.<init>(<console>:15)
        at $line3.$read$$iwC.<init>(<console>:24)
        at $line3.$read.<init>(<console>:26)
        at $line3.$read$.<init>(<console>:30)
        at $line3.$read$.<clinit>(<console>)
        at $line3.$eval$.<init>(<console>:7)
        at $line3.$eval$.<clinit>(<console>)
        at $line3.$eval.$print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
        at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
        at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
        at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
        at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
        at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
        at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
        at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
        at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
        at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
        at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
        at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
        at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
        at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
        at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
        at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
        at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
        at org.apache.spark.repl.Main$.main(Main.scala:31)
        at org.apache.spark.repl.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
18/02/14 18:59:35 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
18/02/14 18:59:35 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
18/02/14 18:59:35 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
18/02/14 18:59:35 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
18/02/14 18:59:35 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
18/02/14 18:59:35 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
18/02/14 18:59:35 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
18/02/14 18:59:35 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
18/02/14 18:59:35 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
18/02/14 18:59:35 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
18/02/14 18:59:35 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
18/02/14 18:59:35 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
18/02/14 18:59:35 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
18/02/14 18:59:35 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
18/02/14 18:59:35 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
18/02/14 18:59:35 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
18/02/14 18:59:35 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
18/02/14 18:59:35 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
18/02/14 18:59:35 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
18/02/14 18:59:35 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
18/02/14 18:59:35 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
18/02/14 18:59:35 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
18/02/14 18:59:35 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
18/02/14 18:59:35 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
18/02/14 18:59:35 INFO SparkUI: Stopped Spark web UI at http://192.168.154.114:12335
18/02/14 18:59:35 WARN YarnSchedulerBackend$YarnSchedulerEndpoint: Attempted to request executors before the AM has registered!
18/02/14 18:59:35 INFO YarnClientSchedulerBackend: Shutting down all executors
18/02/14 18:59:35 INFO YarnClientSchedulerBackend: Asking each executor to shut down
18/02/14 18:59:35 INFO SchedulerExtensionServices: Stopping SchedulerExtensionServices (serviceOption=None, services=List(), started=false)
18/02/14 18:59:35 INFO YarnClientSchedulerBackend: Stopped
18/02/14 18:59:35 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
18/02/14 18:59:35 INFO MemoryStore: MemoryStore cleared
18/02/14 18:59:35 INFO BlockManager: BlockManager stopped
18/02/14 18:59:35 INFO BlockManagerMaster: BlockManagerMaster stopped
18/02/14 18:59:35 WARN MetricsSystem: Stopping a MetricsSystem that is not running
18/02/14 18:59:35 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
18/02/14 18:59:35 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
18/02/14 18:59:35 INFO SparkContext: Successfully stopped SparkContext
18/02/14 18:59:35 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
18/02/14 18:59:36 INFO RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
        at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:122)
        at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:62)
        at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:144)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:530)
        at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
        at $iwC$$iwC.<init>(<console>:15)
        at $iwC.<init>(<console>:24)
        at <init>(<console>:26)
        at .<init>(<console>:30)
        at .<clinit>(<console>)
        at .<init>(<console>:7)
        at .<clinit>(<console>)
        at $print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
        at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
        at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
        at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
        at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
        at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
        at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
        at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
        at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
        at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
        at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
        at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
        at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
        at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
        at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
        at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
        at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
        at org.apache.spark.repl.Main$.main(Main.scala:31)
        at org.apache.spark.repl.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
java.lang.NullPointerException
        at org.apache.spark.sql.SQLContext$.createListenerAndUI(SQLContext.scala:1367)
        at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
        at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
        at $iwC$$iwC.<init>(<console>:15)
        at $iwC.<init>(<console>:24)
        at <init>(<console>:26)
        at .<init>(<console>:30)
        at .<clinit>(<console>)
        at .<init>(<console>:7)
        at .<clinit>(<console>)
        at $print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
        at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
        at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
        at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
        at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
        at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
        at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:132)
        at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
        at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
        at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
        at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
        at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
        at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
        at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
        at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
        at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
        at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
        at org.apache.spark.repl.Main$.main(Main.scala:31)
        at org.apache.spark.repl.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
<console>:16: error: not found: value sqlContext
       import sqlContext.implicits._
              ^
<console>:16: error: not found: value sqlContext
       import sqlContext.sql
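The YARN diagnostics above state the root cause directly: "This token is expired ... System times on machines may be out of sync." The two epoch-millisecond timestamps quoted in that message can be decoded to see how far apart the machines' clocks are. The snippet below is just a sanity check on those two numbers (copied verbatim from the log); the variable names and the interpretation of which machine each timestamp belongs to are assumptions based on the wording of the error.

```python
# Decode the timestamps from the YARN error:
#   "This token is expired. current time is 1518614975052 found 1518600781196"
from datetime import datetime, timezone

current_ms = 1518614975052  # clock of the node that rejected the token (assumed)
token_ms = 1518600781196    # timestamp carried in the container token (assumed)

skew_s = (current_ms - token_ms) / 1000.0
print(f"clock skew: {skew_s / 3600:.2f} hours")  # prints "clock skew: 3.94 hours"
print("rejecting node:", datetime.fromtimestamp(current_ms / 1000, tz=timezone.utc))
print("token issued:  ", datetime.fromtimestamp(token_ms / 1000, tz=timezone.utc))
```

A skew of nearly four hours is far beyond the lifetime of a YARN container token, so the NodeManager refuses to launch the ApplicationMaster. Synchronizing the clocks on every node of the cluster (for example with ntpd or chrony) and then relaunching spark-shell should resolve this particular failure.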