On HDP 3.1, Spark copies hive-1.x jars into the /usr/hdp/3.1.4.0-315/spark2/jars/ directory. Because of this, I get classpath conflicts when my Spark code tries to interact with the Hive 3 server. Is there a specific reason why spark2 still ships hive-1.x jars?

I also supply the hive-3.x jars via the --jars option, but it has no effect: since the hive-1.x jars are added to the classpath first, all of the classes are loaded from the hive-1.x jars by the AppClassLoader. After moving those jars into the /tmp/ directory, all of my jobs work fine, because MutableURLClassLoader then loads the classes from the hive-3.x jars. However, deleting them from /usr/hdp/3.1.4.0-315/spark2/jars/ is not a good solution.

Is there a workaround for this issue? Any help would be appreciated. Thanks!
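For reference, here is roughly how I submit the job. The jar paths, versions, and the main class name are illustrative placeholders, not the exact ones from my cluster:

```shell
# Sketch of the spark-submit invocation described above.
# Jar paths and the application class are placeholders.
spark-submit \
  --master yarn \
  --deploy-mode client \
  --jars /opt/hive3/lib/hive-exec-3.1.0.jar,/opt/hive3/lib/hive-metastore-3.1.0.jar \
  --class com.example.MyHiveJob \
  my-app.jar
```

Even with the hive-3.x jars listed in --jars, the classes are still resolved from the hive-1.x jars under /usr/hdp/3.1.4.0-315/spark2/jars/ because those come first on the classpath.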