
SHC on HDP 3.0 With spark 2.4


I am running Spark 2.4 on Azure HDInsight version 4.0, which includes HDP 3.0.

After the upgrade, we are unable to use the Spark-HBase connector (SHC) version 1.1.1-2.1-s_2.11; it fails with the error below:

java.lang.NoSuchMethodError: org.json4s.jackson.JsonMethods$.parse(Lorg/json4s/JsonInput;Z)Lorg/json4s/JsonAST$JValue;
  at org.apache.spark.sql.execution.datasources.hbase.HBaseTableCatalog$.apply(HBaseTableCatalog.scala:257)
  at org.apache.spark.sql.execution.datasources.hbase.HBaseRelation.<init>(HBaseRelation.scala:80)
  at org.apache.spark.sql.execution.datasources.hbase.DefaultSource.createRelation(HBaseRelation.scala:51)
  at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:318)
  at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:223)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:211)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:167)
  at withCatalog(<console>:42)
  ... 56 elided

As suggested in GitHub issue https://github.com/hortonworks-spark/shc/issues/294, I tried building the SHC master branch locally and deploying it, but ran into several dependency conflicts.

Are there any guidelines or workarounds to get SHC working with Spark 2.4 on HDP 3.0?
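For reference, the failure happens on a standard catalog-based SHC read of the following shape (table and column names here are only illustrative; parsing the catalog JSON is where the json4s call in the stack trace originates):

```scala
import org.apache.spark.sql.execution.datasources.hbase.HBaseTableCatalog

// Hypothetical catalog definition; any SHC read of this shape hits
// HBaseTableCatalog.apply, which calls json4s to parse the JSON.
val catalog = s"""{
  |"table":{"namespace":"default", "name":"mytable"},
  |"rowkey":"key",
  |"columns":{
  |  "id":{"cf":"rowkey", "col":"key", "type":"string"},
  |  "value":{"cf":"cf1", "col":"value", "type":"string"}
  |}
  |}""".stripMargin

def withCatalog(cat: String) =
  spark.read
    .options(Map(HBaseTableCatalog.tableCatalog -> cat))
    .format("org.apache.spark.sql.execution.datasources.hbase")
    .load()

// Fails with the NoSuchMethodError above on HDP 3.0 with Spark 2.4
val df = withCatalog(catalog)
```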

1 ACCEPTED SOLUTION


I was able to solve this by modifying the SHC source code and packaging it with the Maven Shade plugin.

By doing this, all the required dependencies are bundled into the SHC jar:
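The NoSuchMethodError suggests the root cause is a json4s binary incompatibility: Spark 2.4 ships a newer json4s than the one SHC was compiled against. The shading approach bundles SHC's own copy and relocates it out of Spark's way. A minimal sketch of the relevant pom.xml section, assuming json4s is the only conflicting dependency (plugin version and relocation prefix are illustrative):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.1</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <relocations>
          <!-- Bundle SHC's json4s and rename its packages so the
               version provided by Spark 2.4 is never loaded instead -->
          <relocation>
            <pattern>org.json4s</pattern>
            <shadedPattern>shaded.org.json4s</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```

Relocation (rather than plain bundling) matters here: without the `<relocation>` entry, Spark's classpath copy of json4s would still win at runtime.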

 

https://github.com/dhananjaypatkar/shc
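Once built, the shaded jar can be passed to Spark as usual (the jar name below is illustrative):

```shell
spark-shell --jars shc-core-shaded.jar
```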

