Run Spark Streaming code in a cluster

Hi All,

We created a Spark Streaming application that is integrated with Kafka. It works fine within Eclipse.

Now we want to deploy it to our Cloudera cluster managed by YARN.

How can we pass parameters like the Kafka broker list and the Kafka topic name to the application running in the cluster?

In Eclipse we just use command-line arguments: "StreamingApp <brokers> <topics>".
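
For reference, here is a simplified sketch of how our main method reads those two arguments and passes them to the Kafka direct stream. The class and variable names are illustrative, following the DirectKafkaWordCount example that ships with Spark, not our exact code:

import kafka.serializer.StringDecoder
import org.apache.spark.SparkConf
import org.apache.spark.streaming.kafka.KafkaUtils
import org.apache.spark.streaming.{Seconds, StreamingContext}

object StreamingApp {
  def main(args: Array[String]): Unit = {
    if (args.length < 2) {
      System.err.println("Usage: StreamingApp <brokers> <topics>")
      System.exit(1)
    }
    // args(0): comma-separated broker list, args(1): comma-separated topic list
    val brokers = args(0)
    val topics = args(1)

    val sparkConf = new SparkConf().setAppName("StreamingApp")
    val ssc = new StreamingContext(sparkConf, Seconds(10))

    // The broker list from the command line goes straight into the Kafka params
    val kafkaParams = Map[String, String]("metadata.broker.list" -> brokers)
    val topicsSet = topics.split(",").toSet

    val messages = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
      ssc, kafkaParams, topicsSet)

    messages.map(_._2).print() // just print the message values for this sketch

    ssc.start()
    ssc.awaitTermination()
  }
}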

Here is an example I saw on the Cloudera website:

./bin/spark-submit --name "My app" --master local[4] --conf spark.eventLog.enabled=false \
  --conf "spark.executor.extraJavaOptions=-XX:+PrintGCDetails -XX:+PrintGCTimeStamps" myApp.jar

Can you please confirm whether that is right, or show how I should pass the brokers and topics info to our application?

Thanks,

Jack


Re: Run Spark Streaming code in a cluster

Can somebody please help?
