
Phoenix table error connecting with spark using df


I am trying to save data to a Phoenix table using Spark and a DataFrame. Here is my code, which throws a `PhysicalTableName` error.

```
import org.apache.spark.sql.SparkSession

def terminalDataIngestor(spark: SparkSession, jsonStr: String): Unit = {
  import spark.implicits._

  // Parse the incoming JSON string into a DataFrame
  val df = spark.read.json(Seq(jsonStr).toDS)
  df.show()

  // Write the DataFrame to the Phoenix table
  df.write.format("org.apache.phoenix.spark")
    .mode("overwrite")
    .option("table", "CASHTRAIN")
    .option("zkUrl", "127.0.0.1:2181")
    .save()
}
```

I get this error, and I am not sure what it means:

```

2020-09-08 12:46:39.970 ERROR 2324 --- [nio-8090-exec-1] o.a.c.c.C.[.[.[/].[dispatcherServlet] : Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Handler dispatch failed; nested exception is java.lang.NoSuchMethodError: org.apache.phoenix.mapreduce.util.PhoenixConfigurationUtil.setPhysicalTableName(Lorg/apache/hadoop/conf/Configuration;Ljava/lang/String;)V] with root cause
java.lang.NoSuchMethodError: org.apache.phoenix.mapreduce.util.PhoenixConfigurationUtil.setPhysicalTableName(Lorg/apache/hadoop/conf/Configuration;Ljava/lang/String;)V
    at org.apache.phoenix.spark.ConfigurationUtil$.getOutputConfiguration(ConfigurationUtil.scala:42) ~[phoenix-spark-5.0.0-HBase-2.0.jar:5.0.0-HBase-2.0]
    at org.apache.phoenix.spark.DataFrameFunctions.saveToPhoenix(DataFrameFunctions.scala:39) ~[phoenix-spark-5.0.0-HBase-2.0.jar:5.0.0-HBase-2.0]
    at org.apache.phoenix.spark.DataFrameFunctions.saveToPhoenix(DataFrameFunctions.scala:28) ~[phoenix-spark-5.0.0-HBase-2.0.jar:5.0.0-HBase-2.0]
    at org.apache.phoenix.spark.DefaultSource.createRelation(DefaultSource.scala:47) ~[phoenix-spark-5.0.0-HBase-2.0.jar:5.0.0-HBase-2.0]
    at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45) ~[spark-sql_2.11-2.3.2.jar:2.3.2]
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70) ~[spark-sql_2.11-2.3.2.jar:2.3.2]
    at ) ~[spring-webmvc-5.2.8.RELEASE.jar:5.2.8.RELEASE] at
```

```
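In case it helps, here is roughly my dependency setup, reconstructed from the jar names in the stack trace above (the exact build file is a sketch, but the version numbers are what the trace shows):

```
// build.sbt (sketch; versions taken from the jars named in the stack trace)
libraryDependencies ++= Seq(
  // Spark SQL, built for Scala 2.11 per spark-sql_2.11-2.3.2.jar
  "org.apache.spark" %% "spark-sql" % "2.3.2",
  // Phoenix Spark connector, per phoenix-spark-5.0.0-HBase-2.0.jar
  "org.apache.phoenix" % "phoenix-spark" % "5.0.0-HBase-2.0"
)
```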

@TimothySpann I have read some of your posts and thought you might be able to help. Thank you in advance!
