
Missing required configuration "partition.assignment.strategy" which has no default value.

Hello guys,

I am working on a use case where I am running my Spark Streaming Scala code on Hortonworks to consume records from Kafka, which is installed on my local machine. When I run the Spark code I get the following error:

ERROR StreamingContext: Error starting the context, marking it as stopped org.apache.kafka.common.config.ConfigException: Missing required configuration "partition.assignment.strategy" which has no default value.


Rising Star
@sohan kanamarlapudi


May I know what versions of Spark and Kafka you are using?


Hello dbains,

Thanks for the reply. I am using Spark 2.1.0 and Kafka

Rising Star

Thank you @sohan kanamarlapudi.

Did you set partition.assignment.strategy to null or an empty string in the properties file that is read by your Spark application?

Possible values for this property are range or roundrobin, and the default is org.apache.kafka.clients.consumer.RangeAssignor.
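To make that concrete, here is a minimal sketch (not from the thread) of pinning the assignor explicitly in the Kafka parameter map that a Spark application would pass to its consumer; the broker address and group id are placeholder assumptions:

```scala
// Hypothetical Kafka consumer parameters for a Spark Streaming job.
// Setting partition.assignment.strategy explicitly avoids the
// "no default value" ConfigException seen when the property is
// null or an empty string.
val kafkaParams = Map[String, String](
  "bootstrap.servers" -> "localhost:9092",   // assumed local broker
  "group.id" -> "example-group",             // placeholder group id
  "partition.assignment.strategy" ->
    "org.apache.kafka.clients.consumer.RangeAssignor"
)
```

In a real job this map would be passed to KafkaUtils.createDirectStream along with the topic list.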


Is it possible for you to share the code snippet where you have configured the Kafka consumer?

(Kindly omit any sensitive information)


Hi dbains,

Thanks for the reply. Actually, I have resolved the issue; it was caused by a version mismatch. Now I am able to stream records from Kafka and print them in the Spark shell.
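For readers hitting the same mismatch: a common fix is making sure the Kafka integration artifact matches the Spark version in the build. A minimal sketch for Maven (artifact name and Scala suffix here are assumptions based on Spark 2.1.0 with the 0-10 connector):

```xml
<!-- Hypothetical dependency block: the connector version should
     track the Spark version, and the _2.11 suffix must match the
     Scala version Spark was built with. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming-kafka-0-10_2.11</artifactId>
  <version>2.1.0</version>
</dependency>
```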


New Contributor

Some classes used by Spark Streaming are provided by other dependencies and get overridden with a different implementation.


The solution is to relocate (shade) all classes from the org.apache.kafka package.


My Spark version is 2.4.1.


Here is the plugin for the Maven build:
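The original snippet was not included in the thread; what follows is a minimal sketch of a maven-shade-plugin relocation that matches the description above (the shaded package prefix "shade" is the contributor's wording, the plugin version is an assumption):

```xml
<!-- Hypothetical shade configuration: relocates org.apache.kafka
     classes into a "shade" prefix so the bundled Kafka client does
     not clash with the one provided on the cluster. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.1</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>org.apache.kafka</pattern>
            <shadedPattern>shade.org.apache.kafka</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```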