Support Questions

Find answers, ask questions, and share your expertise

HDP 3.1 Spark2 uses Hive 1.2.x jars

New Contributor

Spark on HDP-3.1 ships Hive 1.x jars in the /usr/hdp/3.1.4.0-315/spark2/jars/ directory. Because of this, I get classpath conflicts when my Spark code tries to interact with a Hive 3 server. Is there a specific reason Spark 2 still uses the Hive 1.x jars? I also supply the Hive 3.x jars via the --jars option, but it has no effect: since the Hive 1.x jars are added to the classpath first, all classes are loaded from them by the AppClassLoader. After moving the Hive 1.x jars into /tmp/, all my jobs work fine because MutableURLClassLoader then loads the classes from the Hive 3.x jars. Deleting them from /usr/hdp/3.1.4.0-315/spark2/jars/ is not a good solution. Is there a workaround for this issue? Any help would be appreciated. Thanks!
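One commonly suggested workaround for this kind of jar-precedence problem (not mentioned in this thread, so treat it as an assumption to verify on your cluster) is Spark's experimental user-classpath-first settings, which tell Spark to prefer the jars passed via --jars over the ones bundled in spark2/jars. A minimal sketch, with placeholder jar paths and class names:

```shell
# Sketch only: spark.driver.userClassPathFirst and
# spark.executor.userClassPathFirst are real but experimental Spark settings;
# they can introduce other dependency conflicts, so test before relying on them.
# All paths and the main class below are placeholders.
spark-submit \
  --jars /path/to/hive3/hive-exec-3.x.jar,/path/to/hive3/hive-metastore-3.x.jar \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  --class com.example.MyHiveJob \
  my-job.jar
```

This avoids touching the HDP-managed /usr/hdp/.../spark2/jars/ directory, which upgrades would overwrite anyway.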

1 REPLY

Contributor

In HDP 3, Spark 2 ships with the Hive 1.2 jars by default. To take advantage of the newer jars, you will need to go through Hive LLAP, which uses Hive 3.

You may also need to use the Hive Warehouse Connector, since Spark's built-in Hive client cannot read Hive 3's managed (transactional) tables directly.
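For reference, launching Spark with the Hive Warehouse Connector on HDP 3 typically looks roughly like the sketch below. The jar version, hostnames, and ports are placeholders; on a real cluster the JDBC URL and metastore URI are normally pre-populated in Ambari, so check your cluster's values.

```shell
# Sketch: run Spark against Hive 3 via the Hive Warehouse Connector (HWC)
# instead of the bundled Hive 1.2 client. <version>, llap-host, and
# metastore-host are placeholders for cluster-specific values.
spark-shell \
  --jars /usr/hdp/current/hive_warehouse_connector/hive-warehouse-connector-assembly-<version>.jar \
  --conf spark.sql.hive.hiveserver2.jdbc.url="jdbc:hive2://llap-host:10500/" \
  --conf spark.datasource.hive.warehouse.metastoreUri="thrift://metastore-host:9083"

# Inside the shell, queries then go through HiveServer2 Interactive (LLAP):
#   import com.hortonworks.hwc.HiveWarehouseSession
#   val hive = HiveWarehouseSession.session(spark).build()
#   hive.executeQuery("SELECT * FROM my_db.my_table").show()
```

With this setup the Hive 3 access happens through the connector rather than through the jars on Spark's own classpath, which sidesteps the Hive 1.x vs. Hive 3.x jar conflict described above.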