Support Questions
Find answers, ask questions, and share your expertise

Spark to Phoenix error

I am trying to read a Phoenix table as a DataFrame and I get the following error:

java.lang.ClassCastException: org.apache.spark.sql.catalyst.expressions.GenericMutableRow cannot be cast to org.apache.spark.sql.Row

I am using HDP 2.4 and I am running this Spark program from the Scala IDE.

Attached is the exception stack trace.

The program is also attached: phoenix-spark.txt, stacktrace.txt
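For context, the read pattern that usually triggers this error is the phoenix-spark DataFrame load. The following is a minimal sketch only; the table name and zkUrl are placeholders, not taken from the attached program, so substitute your own values.

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object PhoenixReadSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("phoenix-read"))
    val sqlContext = new SQLContext(sc)

    // "TABLE1" and the ZooKeeper URL below are placeholder values.
    val df = sqlContext.read
      .format("org.apache.phoenix.spark")
      .option("table", "TABLE1")
      .option("zkUrl", "localhost:2181:/hbase-unsecure")
      .load()

    // The ClassCastException surfaces when an action is run on the DataFrame.
    df.show()
  }
}
```

This requires the phoenix-spark and phoenix-client jars on the classpath at runtime, which is where the version mismatch discussed below comes in.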


Re: Spark to Phoenix error

@Mukesh Kumar

Thanks for the help, but it didn't work.

I am using the correct versions:

Spark 1.6.0

Phoenix 4.4

HDP 2.4

Any thoughts on how I could resolve this?

Re: Spark to Phoenix error

I think in your case Spark 1.6.0 applies User Defined Types (UDTs) during DataFrame creation and runs into the issue below. Could you please verify...

Re: Spark to Phoenix error

Does anyone have a fix for this issue? It looks like a dependency mismatch between the Spark and Phoenix jars. Would an upgrade to the latest version of HDP fix this?

Re: Spark to Phoenix error

It seems there was a similar issue in Spark 1.5, and it was fixed.

But I am still getting the issue in Spark 1.6 (HDP 2.4). Are there any workarounds for it, or will an upgrade of HDP help?
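One workaround often suggested for jar-mismatch problems like this is to put the HDP build of the Phoenix client jar explicitly on both the driver and executor classpaths, so Spark does not pick up an incompatible copy. The paths below are assumptions based on a typical HDP layout; check where the jar actually lives on your cluster before using them, e.g. in spark-defaults.conf:

```
spark.driver.extraClassPath   /usr/hdp/current/phoenix-client/phoenix-client.jar
spark.executor.extraClassPath /usr/hdp/current/phoenix-client/phoenix-client.jar
```

The same jar can alternatively be passed per job with the --jars option of spark-submit.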

Re: Spark to Phoenix error

I have a similar problem. I can use the jdbc format, but I cannot use the zkUrl samples.

It may be solved in Spark 1.5, but the problem continues in Spark 1.6.
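For comparison, the plain JDBC read path that works around the phoenix-spark plugin looks roughly like the sketch below. The URL and table name are placeholders, not values from this thread.

```scala
import org.apache.spark.sql.SQLContext

// Assumes an existing sqlContext and the Phoenix client jar on the classpath.
def readViaJdbc(sqlContext: SQLContext) = {
  sqlContext.read
    .format("jdbc")
    .option("driver", "org.apache.phoenix.jdbc.PhoenixDriver")
    // Placeholder connection URL; point it at your ZooKeeper quorum.
    .option("url", "jdbc:phoenix:localhost:2181:/hbase-unsecure")
    .option("dbtable", "TABLE1") // placeholder table name
    .load()
}
```

This avoids the phoenix-spark relation that performs the GenericMutableRow cast, at the cost of losing the plugin's predicate pushdown.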


When I use the write interface, it runs well.

Please share the fix if you find anything.