Missing required configuration "partition.assignment.strategy" which has no default value.
Labels: Apache Kafka, Apache Spark
Created on 07-09-2018 09:08 PM - edited 09-16-2022 06:26 AM
Hello guys,
I am working on a use case where I run my Spark Streaming Scala code on Hortonworks to consume records from Kafka, which is installed on my local machine. When I run the Spark code, I get the following error:
ERROR StreamingContext: Error starting the context, marking it as stopped org.apache.kafka.common.config.ConfigException: Missing required configuration "partition.assignment.strategy" which has no default value.
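Here is roughly the shape of the setup, a generic sketch following the standard spark-streaming-kafka-0-10 pattern rather than the exact application code; the broker address, group id, and topic name are placeholders. The exception surfaces when the streaming context is started:

import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

object KafkaStreamSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("kafka-stream-sketch")
    val ssc = new StreamingContext(conf, Seconds(5))

    // Consumer properties handed to the Kafka 0.10 integration.
    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "localhost:9092",   // placeholder broker
      "key.deserializer" -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id" -> "test-group",                // placeholder group id
      "auto.offset.reset" -> "latest"
    )

    val stream = KafkaUtils.createDirectStream[String, String](
      ssc,
      PreferConsistent,
      Subscribe[String, String](Seq("test-topic"), kafkaParams) // placeholder topic
    )
    stream.map(record => record.value).print()

    // The ConfigException above is thrown here, while starting the context.
    ssc.start()
    ssc.awaitTermination()
  }
}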
Created 07-11-2018 02:22 PM
Hello dbains,
Thanks for the reply. I am using Spark 2.1.0 and Kafka 0.10.0.0.
Created 07-11-2018 06:52 PM
Thank you @sohan kanamarlapudi.
Did you set partition.assignment.strategy to null or an empty string in the properties file read by your Spark application?
Possible values for this property are range and roundrobin, and the default is org.apache.kafka.clients.consumer.RangeAssignor.
Reference: https://kafka.apache.org/0100/documentation.html#newconsumerconfigs
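For example, the strategy can be pinned explicitly in the consumer properties; a minimal sketch in Scala, with a placeholder broker address:

// Minimal consumer properties with the assignor pinned explicitly.
// Mapping this key to null instead would reproduce the ConfigException above.
val kafkaParams = Map[String, Object](
  "bootstrap.servers" -> "localhost:9092", // placeholder broker
  "partition.assignment.strategy" ->
    "org.apache.kafka.clients.consumer.RangeAssignor"
)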
Is it possible for you to share the code snippet where you have configured the Kafka consumer?
(Kindly omit any sensitive information)
Thanks!
Created 07-12-2018 02:09 AM
Hi dbains,
Thanks for the reply. I have actually resolved the issue: it was caused by a version mismatch. Now I am able to stream records from Kafka and print them in the Spark shell.
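For anyone who hits the same thing: the fix was making the build dependencies consistent with each other. A sketch of the aligned versions, assuming an sbt build (illustrative layout, not my exact build file):

// build.sbt sketch: the spark-streaming-kafka-0-10 artifact must match the
// Spark version, and it pulls in a compatible kafka-clients transitively.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-streaming" % "2.1.0" % "provided",
  "org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.1.0"
)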
Thanks
Created 10-31-2022 02:34 AM
Some of the Kafka classes provided by the streaming dependencies were being overridden by a different implementation elsewhere on the classpath. The solution is to relocate all classes in the org.apache.kafka package into a shaded package.
My Spark version is 2.4.1.
Here is the plugin configuration for the Maven build:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.4.3</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>org.apache.kafka</pattern>
            <shadedPattern>shade.org.apache.kafka</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
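With this relocation in place, mvn package rewrites every bytecode reference to org.apache.kafka inside the assembled jar to shade.org.apache.kafka, so the application always runs against the Kafka classes it was built with, regardless of which Kafka version ends up on the cluster classpath.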
