
run spark streaming code in cluster



Hi All,


We created a Spark Streaming application that is integrated with Kafka. It works fine within Eclipse.

Now we want to deploy it to our Cloudera cluster, which is managed by YARN.

How can we pass parameters such as the Kafka broker list and Kafka topic name to the application running in the cluster?

In Eclipse we just use command-line arguments: "StreamingApp <brokers> <topics>".


Here is an example I found on the Cloudera website:

./bin/spark-submit --name "My app" --master local[4] --conf spark.eventLog.enabled=false
  --conf "spark.executor.extraJavaOptions=-XX:+PrintGCDetails -XX:+PrintGCTimeStamps" myApp.jar
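For what it's worth, anything placed after the application jar on the spark-submit command line is forwarded to the application's main method as arguments. A sketch of what a submit command for a YARN cluster might look like; the class name, jar name, broker addresses, and topic below are illustrative placeholders, not values from the Cloudera example:

```shell
# Arguments listed after the application jar are passed to main(String[] args).
# com.example.StreamingApp, myApp.jar, the brokers, and the topic are placeholders.
./bin/spark-submit \
  --name "StreamingApp" \
  --master yarn \
  --deploy-mode cluster \
  --class com.example.StreamingApp \
  myApp.jar \
  broker1:9092,broker2:9092 myTopic
```

With a command like this, the first argument would hold the broker list and the second the topic name, the same as when running "StreamingApp <brokers> <topics>" from Eclipse.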

Can you please explain how I should pass the broker and topic info to our application?


Thanks,

Jack

1 REPLY 1

Re: run spark streaming code in cluster


Can somebody please help?