<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Difference between local[*] vs yarn-cluster vs yarn-client for SparkConf (Java) - SparkConf Master URL Configuration in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/Difference-between-local-vs-yarn-cluster-vs-yarn-client-for/m-p/161246#M123625</link>
    <description>&lt;P&gt;My scenario:&lt;/P&gt;&lt;P&gt;I would like to expose a Java microservice (a Spring Boot application) that runs a spark-submit on demand to yield the required results.&lt;/P&gt;&lt;P&gt;I have been allotted 2 data nodes and 1 edge node for development, and the edge node hosts the microservice. When I tried yarn-cluster mode, I got the exception 'Detected yarn-cluster mode, but isn't running on a cluster. Deployment to YARN is not supported directly by SparkContext. Please use spark-submit.'&lt;/P&gt;&lt;P&gt;What is the right approach here? Since the service is on demand, I cannot use YARN client mode, because that would require a second main class beyond the one already used by the Spring Boot starter.&lt;/P&gt;&lt;P&gt;Code is below.&lt;/P&gt;&lt;P&gt;MicroServiceController.java:&lt;/P&gt;&lt;P&gt;@RequestMapping(value = "/transform", method = RequestMethod.POST, consumes = MediaType.APPLICATION_JSON_VALUE, produces = MediaType.APPLICATION_JSON_VALUE) &lt;/P&gt;&lt;P&gt;public String initiateTransformation(@RequestBody TransformationRequestVO requestVO){ &lt;/P&gt;&lt;P&gt;PublicationProcessor.run(); &lt;/P&gt;&lt;P&gt;return "SUCCESS"; &lt;/P&gt;&lt;P&gt;}&lt;/P&gt;&lt;P&gt;PublicationProcessor.java:&lt;/P&gt;&lt;P&gt;public static void run() { &lt;/P&gt;&lt;P&gt;try{ &lt;/P&gt;&lt;P&gt;SparkConf sC = new SparkConf().setAppName("NPUB_TRANSFORMATION_US")
.setMaster("yarn-cluster")
.set("spark.executor.instances", PropertyBundle.getConfigurationValue("spark.executor.instances"))
.set("spark.executor.cores", PropertyBundle.getConfigurationValue("spark.executor.cores"))
.set("spark.driver.memory",PropertyBundle.getConfigurationValue("spark.driver.memory"))
.set("spark.executor.memory",PropertyBundle.getConfigurationValue("spark.executor.memory"))
.set("spark.driver.maxResultSize", PropertyBundle.getConfigurationValue("spark.driver.maxResultSize"))
.set("spark.network.timeout",PropertyBundle.getConfigurationValue("spark.network.timeout")); &lt;/P&gt;&lt;P&gt;
JavaSparkContext jSC = new JavaSparkContext(sC); &lt;/P&gt;&lt;P&gt;sqlContext = new SQLContext(jSC);&lt;/P&gt;&lt;P&gt;processTransformation(); &lt;/P&gt;&lt;P&gt;}catch(Exception e){&lt;/P&gt;&lt;P&gt;
System.out.println("REQUEST ABORTED..."+e.getMessage()); &lt;/P&gt;&lt;P&gt;} &lt;/P&gt;&lt;P&gt;}&lt;/P&gt;&lt;P&gt;,&lt;/P&gt;</description>
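    <!-- Editor's note: the exception quoted above points at spark-submit because a SparkContext cannot deploy itself to YARN in cluster mode. One way to keep the on-demand microservice design is to launch the job programmatically with org.apache.spark.launcher.SparkLauncher, the in-process wrapper around spark-submit, so the Spring Boot JVM stays separate from the Spark driver. This is a sketch, not the poster's code: the jar path and main class are hypothetical placeholders, and PropertyBundle is reused from the original post. -->

```java
import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class PublicationLauncher {

    // Sketch only: submits the transformation job to YARN in cluster mode,
    // equivalent to running spark-submit from the edge node. The driver then
    // runs on the cluster, not inside the Spring Boot service JVM.
    public static void run() throws Exception {
        SparkAppHandle handle = new SparkLauncher()
                .setAppResource("/opt/jobs/npub-transformation.jar") // hypothetical jar path
                .setMainClass("com.example.TransformationJob")       // hypothetical driver class
                .setMaster("yarn")
                .setDeployMode("cluster")
                .setConf(SparkLauncher.EXECUTOR_CORES,
                        PropertyBundle.getConfigurationValue("spark.executor.cores"))
                .setConf(SparkLauncher.EXECUTOR_MEMORY,
                        PropertyBundle.getConfigurationValue("spark.executor.memory"))
                .startApplication();

        // Poll until YARN reports a terminal state (FINISHED, FAILED, or KILLED).
        while (!handle.getState().isFinal()) {
            Thread.sleep(1000L);
        }
        System.out.println("Final state: " + handle.getState());
    }
}
```

    <!-- The controller could call PublicationLauncher.run() in place of PublicationProcessor.run(); because each request submits a fresh YARN application, no long-lived SparkContext is held in the service. -->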
    <pubDate>Fri, 17 Mar 2017 02:43:16 GMT</pubDate>
    <dc:creator>Faisalr_ahamed</dc:creator>
    <dc:date>2017-03-17T02:43:16Z</dc:date>
  </channel>
</rss>

