Created on 07-09-2018 09:08 PM - edited 09-16-2022 06:26 AM
I am working on a use case where I am running my Spark Streaming Scala code on Hortonworks, consuming records from Kafka, which is installed on my local machine. When I run the Spark code I get the following error:
ERROR StreamingContext: Error starting the context, marking it as stopped org.apache.kafka.common.config.ConfigException: Missing required configuration "partition.assignment.strategy" which has no default value.
Created 07-10-2018 08:37 PM
Created 07-11-2018 02:22 PM
Thanks for the reply. I am using Spark 2.1.0 and Kafka 0.10.0.0.
Created 07-11-2018 06:52 PM
Thank you @sohan kanamarlapudi.
Did you set partition.assignment.strategy to null or an empty string in the properties file read by your Spark application?
Possible values for this property are range and roundrobin; the default is org.apache.kafka.clients.consumer.RangeAssignor.
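For reference, here is a minimal sketch of a consumer-parameter map that sets partition.assignment.strategy explicitly. The names and values (kafkaParams, the broker address, the group id) are placeholders, not taken from the original poster's code:

```scala
// Hypothetical Kafka consumer parameters for a Spark Streaming app.
// "bootstrap.servers" and "group.id" values are placeholders.
val kafkaParams: Map[String, Object] = Map(
  "bootstrap.servers" -> "localhost:9092",
  "key.deserializer" -> "org.apache.kafka.common.serialization.StringDeserializer",
  "value.deserializer" -> "org.apache.kafka.common.serialization.StringDeserializer",
  "group.id" -> "example-group",
  // Setting the strategy explicitly works around the
  // "Missing required configuration" error when a mismatched
  // kafka-clients jar on the classpath loses the default.
  "partition.assignment.strategy" -> "org.apache.kafka.clients.consumer.RangeAssignor"
)
```

This map would then be passed to the direct stream, e.g. via ConsumerStrategies.Subscribe in the spark-streaming-kafka-0-10 integration.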
Is it possible for you to share the code snippet where you have configured the Kafka consumer?
(Kindly omit any sensitive information)
Created 07-12-2018 02:09 AM
Thanks for the reply. I have actually resolved the issue: it was caused by a version mismatch. I am now able to stream records from Kafka and print them in the Spark shell.
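The poster does not show the corrected dependencies, but for Spark 2.1.0 with Kafka 0.10.x the matching integration artifact is spark-streaming-kafka-0-10 at the same Spark version. A sketch of aligned sbt dependencies (the exact build file is an assumption, not from the original post):

```scala
// build.sbt fragment: keep the Kafka integration artifact at the
// same version as Spark itself to avoid classpath mismatches.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-streaming" % "2.1.0" % "provided",
  "org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.1.0"
)
```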
Created 10-31-2022 02:34 AM
Some Kafka classes provided by the Spark streaming dependencies were overriding the application's classes with a different implementation.
The solution is to relocate (shade) all classes in the org.apache.kafka package.
My Spark version is 2.4.1.
Here is the plugin for the Maven build:
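The plugin configuration itself was not included in the post; a sketch of a maven-shade-plugin setup that performs the relocation described above (version number and shaded package name are assumptions):

```xml
<!-- Sketch: relocate org.apache.kafka classes so the application's
     kafka-clients version is not shadowed by the cluster's copy. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.4</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>org.apache.kafka</pattern>
            <shadedPattern>shade.org.apache.kafka</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```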