
Unable to run unit tests after adding dependency on Hive Warehouse Connector

New Contributor

I am trying to use the Hive Warehouse Connector (hive-warehouse-connector_2.11-1.0.0.3.1.0.53-1.jar) in one of our Spark applications. After I added this dependency in build.sbt, I started getting the following exception while running the unit tests:

 

java.lang.SecurityException: class "org.codehaus.janino.JaninoRuntimeException"'s signer information does not match signer information of other classes in the same package
at java.lang.ClassLoader.checkCerts(ClassLoader.java:898)
at java.lang.ClassLoader.preDefineClass(ClassLoader.java:668)
at java.lang.ClassLoader.defineClass(ClassLoader.java:761)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.apache.spark.sql.catalyst.expressions.codegen.GenerateSafeProjection$.create(GenerateSafeProjection.scala:197)
at org.apache.spark.sql.catalyst.expressions.codegen.GenerateSafeProjection$.create(GenerateSafeProjection.scala:36)
at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator.generate(CodeGenerator.scala:1321)
at org.apache.spark.sql.Dataset.org$apache$spark$sql$Dataset$$collectFromPlan(Dataset.scala:3272)
at org.apache.spark.sql.Dataset$$anonfun$head$1.apply(Dataset.scala:2484)
at org.apache.spark.sql.Dataset$$anonfun$head$1.apply(Dataset.scala:2484)
at org.apache.spark.sql.Dataset$$anonfun$52.apply(Dataset.scala:3254)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:77)
at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3253)
at org.apache.spark.sql.Dataset.head(Dataset.scala:2484)
at org.apache.spark.sql.Dataset.take(Dataset.scala:2698)

 

My understanding is that this happens because the janino classes are present in two jars:

  • janino-3.0.8.jar (a dependency of spark-sql)
  • hive-warehouse-connector_2.11-1.0.0.3.1.0.53-1.jar

and that these two jars are signed differently.
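
To confirm which classpath entries actually provide the conflicting class, every location of the class file can be listed with plain JDK calls; a quick sketch (nothing HWC-specific), run from a test or a Scala REPL on the same classpath:

// Lists every classpath entry that contains the conflicting janino class;
// two different jars in the output confirms the duplicate.
import scala.collection.JavaConverters._

getClass.getClassLoader
  .getResources("org/codehaus/janino/JaninoRuntimeException.class")
  .asScala
  .foreach(println)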

 

Can somebody help me resolve this?

 

2 REPLIES

Re: Unable to run unit tests after adding dependency on Hive Warehouse Connector

New Contributor

The HiveWarehouseConnector jar should come after the Spark SQL jar on the classpath.
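
For an sbt build, a sketch of that ordering is below. This assumes the test classpath follows the declaration order, which is worth verifying with sbt "show Test/fullClasspath" (or "show test:fullClasspath" on older sbt); the connector's group ID is an assumption, the version comes from the jar name above, and the Spark version should match your cluster:

// build.sbt (sketch): declare spark-sql before the HWC assembly so the janino
// classes resolve from the jar spark-sql pulls in, rather than the copies
// bundled inside the connector assembly.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-sql" % "2.3.1",
  "com.hortonworks.hive" % "hive-warehouse-connector_2.11" % "1.0.0.3.1.0.53-1"
)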

Re: Unable to run unit tests after adding dependency on Hive Warehouse Connector

Explorer

Hi @sappu ,

I want to write Spark DataFrame data to a Hive table using the Hive Warehouse Connector. I am running the application from spark-shell.

DF.write.format(HiveWarehouseSession.HIVE_WAREHOUSE_CONNECTOR).mode(SaveMode.Overwrite).option("table","Demo").save()

Sometimes it loads the data into the Hive table, and sometimes it throws the exception below:

Caused by: java.lang.SecurityException: class "org.codehaus.janino.JaninoRuntimeException"'s signer information does not match signer information of other classes in the same package

 

I have set the Spark classpath as below:
export CLASSPATH=/usr/hdp/3.0.1.0-187/spark2/jars/spark-sql_2.11-2.3.1.3.0.1.0-187.jar:/usr/hdp/current/hive_warehouse_connector/hive-warehouse-connector-assembly-1.0.0.3.0.1.0-187.jar

It still intermittently throws the error above.
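
As far as I know, the documented way to launch spark-shell with HWC is to pass the assembly with --jars (so it lands after Spark's own jars) rather than exporting CLASSPATH. A sketch with placeholder hosts/ports; additional HWC confs such as the metastore URI may be needed on your cluster:

spark-shell \
  --jars /usr/hdp/current/hive_warehouse_connector/hive-warehouse-connector-assembly-1.0.0.3.0.1.0-187.jar \
  --conf spark.sql.hive.hiveserver2.jdbc.url="jdbc:hive2://<hiveserver2-host>:10000/"

// inside the shell: build the HWC session, then write the DataFrame
import com.hortonworks.hwc.HiveWarehouseSession
import org.apache.spark.sql.SaveMode

val hive = HiveWarehouseSession.session(spark).build()

DF.write
  .format(HiveWarehouseSession.HIVE_WAREHOUSE_CONNECTOR)
  .mode(SaveMode.Overwrite)
  .option("table", "Demo")
  .save()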