
Spark to Phoenix error

I am trying to read a Phoenix table as a DataFrame, and I get the following error:

java.lang.ClassCastException: org.apache.spark.sql.catalyst.expressions.GenericMutableRow cannot be cast to org.apache.spark.sql.Row

I am using HDP 2.4, and I am running this Spark program from the Scala IDE.

The exception stack trace and the program are attached: phoenix-spark.txt, stacktrace.txt
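For reference, a minimal sketch of the kind of read that triggers this, assuming the phoenix-spark DataFrame API on Spark 1.6 (the table name and ZooKeeper URL below are placeholders, not taken from the attachments):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Sketch: load a Phoenix table as a DataFrame via the phoenix-spark
// plugin (Spark 1.6 API). "TABLE1" and the zkUrl value are placeholders.
object PhoenixReadSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("phoenix-read"))
    val sqlContext = new SQLContext(sc)

    val df = sqlContext.read
      .format("org.apache.phoenix.spark")
      .option("table", "TABLE1")          // placeholder table name
      .option("zkUrl", "localhost:2181")  // placeholder ZooKeeper quorum
      .load()

    // The ClassCastException in this thread surfaces once rows are
    // actually materialized, e.g. on show() or collect().
    df.show()
  }
}
```

Running this requires the Phoenix client jar on the Spark classpath; it will not compile or run without the Spark and Phoenix dependencies.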


Re: Spark to Phoenix error

@Mukesh Kumar

Thanks for the help, but it didn't work.

I am using the correct versions:

Spark 1.6.0

Phoenix 4.4

HDP 2.4

Any thoughts on how I could resolve this?


Re: Spark to Phoenix error

I think that in your case, Spark 1.6.0 applies a User Defined Type (UDT) during creation of the DataFrame and runs into the issue below. Could you please verify?

https://issues.apache.org/jira/browse/SPARK-12878

Re: Spark to Phoenix error

Does anyone have a fix for this issue? It looks like a dependency mismatch between the Spark and Phoenix jars. Will an upgrade to the latest version of HDP fix it?

Re: Spark to Phoenix error

It seems there was a similar issue in Spark 1.5, and it was fixed:

https://issues.apache.org/jira/browse/PHOENIX-2287

But I am still getting the issue in Spark 1.6 (HDP 2.4). Are there any workarounds for it, or will an upgrade of HDP help?
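For what it's worth, the workaround most often suggested for this kind of Spark/Phoenix jar mismatch on HDP (the exact path and jar name are assumptions for an HDP 2.4 layout, not confirmed in this thread) is to put the Phoenix client jar on both the driver and executor classpaths, e.g. in spark-defaults.conf:

```
spark.driver.extraClassPath   /usr/hdp/current/phoenix-client/phoenix-client.jar
spark.executor.extraClassPath /usr/hdp/current/phoenix-client/phoenix-client.jar
```

After changing these properties, restart the Spark application (and the Spark Thrift Server, if used) so the new classpath takes effect.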

Re: Spark to Phoenix error


I have a similar problem. I can use the jdbc format, but I cannot use the zkUrl samples from https://phoenix.apache.org/phoenix_spark.html

It may have been solved in Spark 1.5, but the problem continues in Spark 1.6 with

format("org.apache.phoenix.spark")

When I use the write interface, it runs fine.

Please share the fix when you find one.
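For comparison, a sketch of the jdbc-format read path the poster reports working, assuming Spark 1.6 and the Phoenix JDBC driver on the classpath (the quorum and table name are placeholders):

```scala
// Reading a Phoenix table through Spark's generic JDBC source instead of
// the phoenix-spark plugin. Connection details below are placeholders.
val df = sqlContext.read
  .format("jdbc")
  .option("driver", "org.apache.phoenix.jdbc.PhoenixDriver")
  .option("url", "jdbc:phoenix:localhost:2181")  // placeholder ZooKeeper quorum
  .option("dbtable", "TABLE1")                   // placeholder table name
  .load()

df.show()
```

This route avoids the phoenix-spark plugin's row conversion (where the GenericMutableRow cast fails), at the cost of losing the plugin's predicate pushdown and parallel split handling.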