I am working on a use case where I run my Spark Streaming Scala code on Hortonworks to consume records from Kafka, which is installed on my local machine. When I ran the Spark code, I got the following error:
ERROR StreamingContext: Error starting the context, marking it as stopped
org.apache.kafka.common.config.ConfigException: Missing required configuration "partition.assignment.strategy" which has no default value.
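For context, the stream is set up along the lines of the following minimal sketch; the broker address, topic, group id, and object name here are placeholders, not my real values:

import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

object KafkaStreamApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("KafkaStreamApp")
    val ssc  = new StreamingContext(conf, Seconds(5))

    // Kafka broker runs on my local machine; Spark runs on the Hortonworks cluster.
    val kafkaParams = Map[String, Object](
      "bootstrap.servers"  -> "localhost:9092",           // placeholder broker address
      "key.deserializer"   -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id"           -> "my-consumer-group",        // placeholder group id
      "auto.offset.reset"  -> "latest"
    )

    val stream = KafkaUtils.createDirectStream[String, String](
      ssc,
      PreferConsistent,
      Subscribe[String, String](Array("my-topic"), kafkaParams))  // placeholder topic

    stream.map(_.value).print()

    ssc.start()            // the ConfigException surfaces here, when the context starts
    ssc.awaitTermination()
  }
}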
Thanks for the reply. I am using Spark 2.1.0 and Kafka 0.10.0.0.
Thank you @sohan kanamarlapudi.
Did you set partition.assignment.strategy to null or an empty string in the properties file that your Spark application reads?
For the old consumer the possible values for this property were range or roundrobin; the new consumer, which your ConfigException comes from, takes assignor class names instead, and the default is [org.apache.kafka.clients.consumer.RangeAssignor].
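If the property did end up blank, one option is to set the strategy explicitly in the consumer parameters. A minimal sketch, assuming the new-consumer config map used by spark-streaming-kafka-0-10 (broker address and group id are placeholders):

import org.apache.kafka.common.serialization.StringDeserializer

// Pass partition.assignment.strategy explicitly; the new consumer expects
// an assignor class name here rather than the old "range"/"roundrobin" values.
val kafkaParams = Map[String, Object](
  "bootstrap.servers"             -> "localhost:9092",     // placeholder
  "group.id"                      -> "my-consumer-group",  // placeholder
  "key.deserializer"              -> classOf[StringDeserializer],
  "value.deserializer"            -> classOf[StringDeserializer],
  "partition.assignment.strategy" -> "org.apache.kafka.clients.consumer.RangeAssignor"
)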
Is it possible for you to share the code snippet where you have configured the Kafka consumer?
(Kindly omit any sensitive information.)
Thanks for the reply. Actually, I have resolved the issue: it was caused by a version mismatch. I am now able to stream records from Kafka and print them in the Spark shell.
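For anyone who lands here later: in my case the mismatch was between the Kafka client pulled in by the Spark connector and the one I had declared myself. A sketch of the dependency block that worked for me, assuming sbt and Scala 2.11 (your versions may differ):

// build.sbt (sketch): let spark-streaming-kafka-0-10 bring in its own
// matching kafka-clients instead of declaring a different version yourself.
scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-streaming"            % "2.1.0" % "provided",
  "org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.1.0"
)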