I am trying to read a Phoenix table as a DataFrame and I get the following error:
java.lang.ClassCastException: org.apache.spark.sql.catalyst.expressions.GenericMutableRow cannot be cast to org.apache.spark.sql.Row
I am using HDP 2.4 and running this Spark program from the Scala IDE.
Attached is the exception stack trace.
Has anyone found a fix for this issue? It looks like a dependency mismatch between the Spark and Phoenix jars. Will an upgrade to the latest version of HDP fix it?
There seems to have been a similar issue in Spark 1.5, and it was reportedly fixed.
But I am still hitting the issue in Spark 1.6 (HDP 2.4). Are there any workarounds for it, or will an upgrade of HDP help?
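For reference, the read that triggers the exception follows the documented phoenix-spark usage. A minimal sketch, where the table name and zkUrl are placeholders for my actual values:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object PhoenixReadExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("phoenix-read"))
    val sqlContext = new SQLContext(sc)

    // Read a Phoenix table as a DataFrame via the phoenix-spark connector.
    // "TABLE1" and "zk-host:2181" are placeholders for the real table/quorum.
    val df = sqlContext.read
      .format("org.apache.phoenix.spark")
      .options(Map("table" -> "TABLE1", "zkUrl" -> "zk-host:2181"))
      .load()

    // The ClassCastException only surfaces once an action materializes rows.
    df.show()
  }
}
```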
I have a similar problem. I can read through the JDBC format, but I cannot use the zkUrl samples from https://phoenix.apache.org/phoenix_spark.html
This may have been solved in Spark 1.5, but the problem continues in Spark 1.6.
When I use the write interface, it runs well.
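The write path that works for me follows the documented phoenix-spark save. A minimal sketch, assuming an existing DataFrame `df` whose columns match the target table; the table name and zkUrl are placeholders:

```scala
// Writing works: save a DataFrame to Phoenix via the connector.
// "OUTPUT_TABLE" and "zk-host:2181" are placeholders for real values.
df.write
  .format("org.apache.phoenix.spark")
  .mode("overwrite") // phoenix-spark requires SaveMode.Overwrite
  .options(Map("table" -> "OUTPUT_TABLE", "zkUrl" -> "zk-host:2181"))
  .save()
```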
Please post the fix here if you find anything.
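In the meantime, the JDBC read path I mentioned above works as a workaround. A sketch, assuming a `sqlContext` is already in scope and the Phoenix JDBC driver jar is on the classpath; the URL and table name are placeholders:

```scala
// Workaround: read through the Phoenix JDBC driver instead of the
// phoenix-spark relation, which avoids the GenericMutableRow cast.
// "TABLE1" and the ZooKeeper quorum in the URL are placeholders.
val df = sqlContext.read
  .format("jdbc")
  .option("driver", "org.apache.phoenix.jdbc.PhoenixDriver")
  .option("url", "jdbc:phoenix:zk-host:2181")
  .option("dbtable", "TABLE1")
  .load()
```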