Created 07-18-2019 12:31 AM
Hi Team,
I am facing an issue after a CDH upgrade. I'm getting the error below while running an application that was built on the previous CDH version (5.14.2). At present I'm using CDH 5.16.1.
Code causing the error:
// Create a KuduContext from the Kudu master address and the SparkContext
val kuduContext = new KuduContext(s"$KUDU_MASTER", sc)
// Update rows in the Impala-managed Kudu table from the DataFrame
kuduContext.updateRows(dataframe1, s"impala::$DB.table")
Error: java.io.InvalidClassException: org.apache.kudu.spark.kudu.KuduContext; local class incompatible: stream classdesc serialVersionUID = 5028413704841024451, local class serialVersionUID = 4594263802830859603
Old pom.xml dependencies the application was originally built with:
<dependency>
  <groupId>org.apache.kudu</groupId>
  <artifactId>kudu-client</artifactId>
  <version>1.6.0-cdh5.14.2</version>
  <scope>compile</scope>
</dependency>
<dependency>
  <groupId>org.apache.kudu</groupId>
  <artifactId>kudu-spark2_2.11</artifactId>
  <version>1.6.0-cdh5.14.2</version>
  <scope>compile</scope>
</dependency>
After rebuilding the application with the new Kudu version dependency, the issue was resolved.
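For reference, here is what the updated dependency block looks like after the rebuild. Note that 1.7.0-cdh5.16.1 is my assumption of the Kudu artifact version bundled with CDH 5.16.1, so please verify it against your parcel:
<dependency>
  <groupId>org.apache.kudu</groupId>
  <artifactId>kudu-client</artifactId>
  <version>1.7.0-cdh5.16.1</version>
  <scope>compile</scope>
</dependency>
<dependency>
  <groupId>org.apache.kudu</groupId>
  <artifactId>kudu-spark2_2.11</artifactId>
  <version>1.7.0-cdh5.16.1</version>
  <scope>compile</scope>
</dependency>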
Does this mean that each time the cluster (CDH) is upgraded, I need to rebuild the entire Kudu application?
Created 07-19-2019 09:29 AM
It looks like KuduContext doesn't have an explicitly set serialVersionUID, which means that each release of kudu-spark is binary incompatible and applications will need to be recompiled. I have opened a JIRA to track a fix for this here: https://issues.apache.org/jira/browse/KUDU-2898
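To illustrate the mechanism (a minimal sketch with a hypothetical class, not the actual KuduContext source): when no serialVersionUID is declared, the JVM derives one from the class structure, so a class recompiled in a new release gets a different UID and deserialization fails with exactly the InvalidClassException you saw. Pinning the UID keeps the serialized form stable across releases:

// Hypothetical stand-in for KuduContext: pinning the UID means structural
// changes between releases no longer break deserialization.
@SerialVersionUID(1L)
class StableContext(val kuduMaster: String) extends Serializable

// Without the annotation, the JVM computes the UID from the fields and
// methods, so each incompatible rebuild yields a new, mismatching UID.
class FragileContext(val kuduMaster: String) extends Serializable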
Thank you,
Grant
Created 07-25-2019 09:06 AM
I would like to bring to your notice that this happens only in client mode, not in cluster mode. In cluster mode the old application runs fine without being rebuilt. Could you please explain the reason for this behaviour? It would really help us understand what is going on.
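For clarity, here is how we are selecting the two modes with spark-submit (the class name and jar are placeholders for our application):

# Driver runs on the submitting machine, using the locally bundled jars
spark-submit --master yarn --deploy-mode client --class com.example.KuduApp app.jar

# Driver runs inside the cluster, picking up the cluster's classpath
spark-submit --master yarn --deploy-mode cluster --class com.example.KuduApp app.jar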