
sparkR.session() is not working in client mode


> sparkEnvir <- list(spark.num.executors='3', spark.executor.cores='3')

> sparkR.session(master = "yarn-client", deployMode="client", sparkConfig = list(spark.driver.memory = "2g"), sparkEnvir = sparkEnvir  )
Launching java with spark-submit command /home/ubuntu/spark/bin/spark-submit   --driver-memory "2g" sparkr-shell /tmp/Rtmp3sXJgh/backend_port468d5684d1e1 
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel).
17/08/21 21:53:52 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/08/21 21:53:53 WARN SparkConf: spark.master yarn-client is deprecated in Spark 2.0+, please instead use "yarn" with specified deploy mode.
17/08/21 21:53:55 WARN Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
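For reference, a minimal sketch of how these settings are usually passed in Spark 2.x SparkR (not taken from the post above): sparkR.session() has no sparkEnvir argument (that belonged to the older sparkR.init()), so executor settings go through sparkConfig; the deprecated "yarn-client" master becomes master = "yarn" plus spark.submit.deployMode = "client"; and the standard property for executor count is spark.executor.instances rather than spark.num.executors. Values shown are only illustrative.

library(SparkR)

# Sketch assuming Spark 2.x: all Spark properties go through sparkConfig,
# and the deploy mode is expressed as a property, not a separate argument.
sparkR.session(
  master = "yarn",
  sparkConfig = list(
    spark.submit.deployMode  = "client",
    spark.driver.memory      = "2g",
    spark.executor.instances = "3",  # standard name for the executor count
    spark.executor.cores     = "3"
  )
)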