Error creating a table using the hive warehouse connector and spark

Hi,

Executing a unit test that creates a Hive table using the new Hive Warehouse Connector fails. The code is:

package com.jdbm.test

import com.hortonworks.hwc.HiveWarehouseSession
import org.apache.spark.sql.SparkSession
import org.scalatest.{FunSuite, Matchers}

class CatalogTest extends FunSuite with Matchers {
  test("should get the spark catalog") {

    val spark = SparkSession.builder()
      .config("spark.sql.hive.hiveserver2.jdbc.url", "jdbc:hive2:///")
      .master("local[2]")
      .getOrCreate()
    val hive = HiveWarehouseSession.session(spark).build()
    hive.createTable("transformed_table")
      .column("a", "String")
      .column("b", "String")
      .partition("c", "String")
      .create()

  }

}
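As an aside, in case it matters for reproducing: the test never stops the SparkSession, so repeated runs in the same JVM may reuse driver state. A variant with cleanup would look like this (same logic otherwise; I have not confirmed whether this affects the error):

```scala
// Same test, but stopping the session in a finally block so repeated
// runs in one JVM do not reuse driver state. Sketch only.
test("should get the spark catalog (with cleanup)") {
  val spark = SparkSession.builder()
    .config("spark.sql.hive.hiveserver2.jdbc.url", "jdbc:hive2:///")
    .master("local[2]")
    .getOrCreate()
  try {
    val hive = HiveWarehouseSession.session(spark).build()
    hive.createTable("transformed_table")
      .column("a", "String")
      .column("b", "String")
      .partition("c", "String")
      .create()
  } finally {
    spark.stop() // release the local Spark context between tests
  }
}
```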

My hive-site.xml:

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!--
   Licensed to the Apache Software Foundation (ASF) under one or more
   contributor license agreements.  See the NOTICE file distributed with
   this work for additional information regarding copyright ownership.
   The ASF licenses this file to You under the Apache License, Version 2.0
   (the "License"); you may not use this file except in compliance with
   the License.  You may obtain a copy of the License at

       http://www.apache.org/licenses/LICENSE-2.0

   Unless required by applicable law or agreed to in writing, software
   distributed under the License is distributed on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   See the License for the specific language governing permissions and
   limitations under the License.
-->

<configuration>
    <property>
        <name>hive.in.test</name>
        <value>true</value>
    </property>

    <property>
        <name>javax.jdo.option.ConnectionURL</name>
        <value>jdbc:derby:memory:metastore_db;create=true</value>
    </property>

    <property>
        <name>javax.jdo.option.ConnectionDriverName</name>
        <value>org.apache.derby.jdbc.EmbeddedDriver</value>
    </property>

</configuration>
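For what it's worth, this minimal hive-site.xml never initializes the metastore schema, and the Derby database is created empty on every run. With an embedded metastore, Hive normally needs schema auto-creation enabled for that to work. The standard properties would be the following (the property names are real Hive/DataNucleus settings, but I have not verified whether the connector's embedded HiveServer2 honors them, or whether this is the cause of the failure):

```xml
<!-- Hypothetical additions for an empty in-memory Derby metastore: -->
<property>
    <!-- Let DataNucleus create the metastore tables on first use -->
    <name>datanucleus.schema.autoCreateAll</name>
    <value>true</value>
</property>

<property>
    <!-- Skip the schema-version check against the fresh database -->
    <name>hive.metastore.schema.verification</name>
    <value>false</value>
</property>
```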

The error is:


SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/Users/jorgedelmonte/.m2/repository/org/slf4j/slf4j-log4j12/1.7.25/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/jorgedelmonte/.m2/repository/com/hortonworks/hive/hive-warehouse-connector_2.11/1.0.0.3.1.0.0-78/hive-warehouse-connector_2.11-1.0.0.3.1.0.0-78.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
19/05/08 14:58:41 INFO SparkContext: Running Spark version 2.3.2
19/05/08 14:58:41 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/05/08 14:58:41 INFO SparkContext: Submitted application: 1ffb703b-0fd5-4801-a6b4-7382d3e7d9e1
19/05/08 14:58:41 INFO SecurityManager: Changing view acls to: jorgedelmonte
19/05/08 14:58:41 INFO SecurityManager: Changing modify acls to: jorgedelmonte
19/05/08 14:58:41 INFO SecurityManager: Changing view acls groups to: 
19/05/08 14:58:41 INFO SecurityManager: Changing modify acls groups to: 
19/05/08 14:58:41 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(jorgedelmonte); groups with view permissions: Set(); users  with modify permissions: Set(jorgedelmonte); groups with modify permissions: Set()
19/05/08 14:58:42 INFO Utils: Successfully started service 'sparkDriver' on port 55325.
19/05/08 14:58:42 INFO SparkEnv: Registering MapOutputTracker
19/05/08 14:58:42 INFO SparkEnv: Registering BlockManagerMaster
19/05/08 14:58:42 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
19/05/08 14:58:42 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
19/05/08 14:58:42 INFO DiskBlockManager: Created local directory at /private/var/folders/kq/kczs__gd0y741jdzq492xxj40000gp/T/blockmgr-f4702e61-b5e3-435f-9897-81c5cdf90734
19/05/08 14:58:42 INFO MemoryStore: MemoryStore started with capacity 2004.6 MB
19/05/08 14:58:42 INFO SparkEnv: Registering OutputCommitCoordinator
19/05/08 14:58:42 INFO Utils: Successfully started service 'SparkUI' on port 4040.
19/05/08 14:58:42 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://10.33.146.118:4040
19/05/08 14:58:42 INFO Executor: Starting executor ID driver on host localhost
19/05/08 14:58:42 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 55326.
19/05/08 14:58:42 INFO NettyBlockTransferService: Server created on 10.33.146.118:55326
19/05/08 14:58:42 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
19/05/08 14:58:42 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 10.33.146.118, 55326, None)
19/05/08 14:58:42 INFO BlockManagerMasterEndpoint: Registering block manager 10.33.146.118:55326 with 2004.6 MB RAM, BlockManagerId(driver, 10.33.146.118, 55326, None)
19/05/08 14:58:42 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 10.33.146.118, 55326, None)
19/05/08 14:58:42 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 10.33.146.118, 55326, None)
19/05/08 14:58:43 INFO SharedState: loading hive config file: file:/Users/jorgedelmonte/Documents/test/hive3-test/target/test-classes/hive-site.xml
19/05/08 14:58:43 INFO SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/Users/jorgedelmonte/Documents/test/hive3-test/spark-warehouse').
19/05/08 14:58:43 INFO SharedState: Warehouse path is 'file:/Users/jorgedelmonte/Documents/test/hive3-test/spark-warehouse'.
19/05/08 14:58:43 INFO StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
19/05/08 14:58:44 INFO HWConf: Using HS2 URL: jdbc:hive2:///
19/05/08 14:58:44 INFO HiveConf: Found configuration file file:/Users/jorgedelmonte/Documents/test/hive3-test/target/test-classes/hive-site.xml
Hive Session ID = 2ef5e054-be4b-44a1-a256-afcbf1147cc2
19/05/08 14:58:44 INFO SessionState: Hive Session ID = 2ef5e054-be4b-44a1-a256-afcbf1147cc2
19/05/08 14:58:44 INFO SessionState: Created HDFS directory: /tmp/hive/jorgedelmonte/2ef5e054-be4b-44a1-a256-afcbf1147cc2
19/05/08 14:58:44 INFO SessionState: Created local directory: /var/folders/kq/kczs__gd0y741jdzq492xxj40000gp/T/jorgedelmonte/2ef5e054-be4b-44a1-a256-afcbf1147cc2
19/05/08 14:58:44 INFO SessionState: Created HDFS directory: /tmp/hive/jorgedelmonte/2ef5e054-be4b-44a1-a256-afcbf1147cc2/_tmp_space.db
19/05/08 14:58:44 INFO SQLStdHiveAccessController: Created SQLStdHiveAccessController for session context : HiveAuthzSessionContext [sessionString=2ef5e054-be4b-44a1-a256-afcbf1147cc2, clientType=HIVESERVER2]
19/05/08 14:58:44 WARN SessionState: METASTORE_FILTER_HOOK will be ignored, since hive.security.authorization.manager is set to instance of HiveAuthorizerFactory.
19/05/08 14:58:45 INFO HiveMetaStore: 0: Opening raw store with implementation class:shadehive.org.apache.hadoop.hive.metastore.ObjectStore
19/05/08 14:58:46 WARN ObjectStore: datanucleus.autoStartMechanismMode is set to unsupported value null . Setting it to value: ignored
19/05/08 14:58:46 INFO ObjectStore: RawStore: shadehive.org.apache.hadoop.hive.metastore.ObjectStore@71936a92, with PersistenceManager: null will be shutdown
19/05/08 14:58:46 INFO ObjectStore: ObjectStore, initialize called
19/05/08 14:58:46 INFO MetastoreConf: Found configuration file file:/Users/jorgedelmonte/Documents/test/hive3-test/target/test-classes/hive-site.xml
19/05/08 14:58:46 INFO MetastoreConf: Unable to find config file hivemetastore-site.xml
19/05/08 14:58:46 INFO MetastoreConf: Found configuration file null
19/05/08 14:58:46 INFO MetastoreConf: Unable to find config file metastore-site.xml
19/05/08 14:58:46 INFO MetastoreConf: Found configuration file null
19/05/08 14:58:46 INFO Persistence: Property datanucleus.cache.level2 unknown - will be ignored
19/05/08 14:58:46 INFO HikariDataSource: HikariPool-1 - Starting...
19/05/08 14:58:46 WARN DriverDataSource: Registered driver with driverClassName=org.apache.derby.jdbc.EmbeddedDriver was not found, trying direct instantiation.
19/05/08 14:58:47 INFO PoolBase: HikariPool-1 - Driver does not support get/set network timeout for connections. (Feature not implemented: No details.)
19/05/08 14:58:47 INFO HikariDataSource: HikariPool-1 - Start completed.
19/05/08 14:58:47 INFO HikariDataSource: HikariPool-2 - Starting...
19/05/08 14:58:47 WARN DriverDataSource: Registered driver with driverClassName=org.apache.derby.jdbc.EmbeddedDriver was not found, trying direct instantiation.
19/05/08 14:58:47 INFO PoolBase: HikariPool-2 - Driver does not support get/set network timeout for connections. (Feature not implemented: No details.)
19/05/08 14:58:47 INFO HikariDataSource: HikariPool-2 - Start completed.
19/05/08 14:58:47 INFO ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
19/05/08 14:58:48 INFO ObjectStore: RawStore: shadehive.org.apache.hadoop.hive.metastore.ObjectStore@71936a92, with PersistenceManager: org.datanucleus.api.jdo.JDOPersistenceManager@28dd81ad created in the thread with id: 1
19/05/08 14:58:48 WARN MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
    [... the previous MetaData WARN line is repeated 17 more times ...]
19/05/08 14:58:52 WARN MetaStoreDirectSql: Database initialization failed; direct SQL is disabled
java.lang.NullPointerException
    at org.datanucleus.store.rdbms.query.RDBMSQueryUtils.getStatementForCandidates(RDBMSQueryUtils.java:318)
    at org.datanucleus.store.rdbms.query.JDOQLQuery.compileQueryFull(JDOQLQuery.java:865)
    at org.datanucleus.store.rdbms.query.JDOQLQuery.compileInternal(JDOQLQuery.java:347)
    at org.datanucleus.store.query.Query.executeQuery(Query.java:1816)
    at org.datanucleus.store.query.Query.executeWithArray(Query.java:1744)
    at org.datanucleus.store.query.Query.execute(Query.java:1726)
    at org.datanucleus.api.jdo.JDOQuery.executeInternal(JDOQuery.java:374)
    at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:216)
    at shadehive.org.apache.hadoop.hive.metastore.MetaStoreDirectSql.ensureDbInit(MetaStoreDirectSql.java:243)
    at shadehive.org.apache.hadoop.hive.metastore.MetaStoreDirectSql.<init>(MetaStoreDirectSql.java:187)
    at shadehive.org.apache.hadoop.hive.metastore.ObjectStore.initializeHelper(ObjectStore.java:511)
    at shadehive.org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:430)
    at shadehive.org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:383)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:77)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:137)
    at shadehive.org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:59)
    at shadehive.org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:67)
    at shadehive.org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStoreForConf(HiveMetaStore.java:730)
    at shadehive.org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMSForConf(HiveMetaStore.java:698)
    at shadehive.org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:692)
    at shadehive.org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:779)
    at shadehive.org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:540)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at shadehive.org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:147)
    at shadehive.org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:108)
    at shadehive.org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:80)
    at shadehive.org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:93)
    at shadehive.org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:9138)
    at shadehive.org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:170)
    at shadehive.org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:96)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at shadehive.org.apache.hadoop.hive.metastore.utils.JavaUtils.newInstance(JavaUtils.java:84)
    at shadehive.org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:95)
    at shadehive.org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:148)
    at shadehive.org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:119)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:4656)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:4724)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:4704)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:4995)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:311)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:294)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:464)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:393)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:380)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:351)
    at shadehive.org.apache.hadoop.hive.ql.session.SessionState.setAuthorizerV2Config(SessionState.java:957)
    at shadehive.org.apache.hadoop.hive.ql.session.SessionState.setupAuth(SessionState.java:921)
    at shadehive.org.apache.hadoop.hive.ql.session.SessionState.applyAuthorizationPolicy(SessionState.java:1890)
    at shadehive.org.apache.hive.service.cli.CLIService.applyAuthorizationConfigPolicy(CLIService.java:134)
    at shadehive.org.apache.hive.service.cli.CLIService.init(CLIService.java:118)
    at shadehive.org.apache.hive.service.cli.thrift.EmbeddedThriftBinaryCLIService.init(EmbeddedThriftBinaryCLIService.java:63)
    at shadehive.org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:282)
    at shadehive.org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107)
    at org.apache.commons.dbcp2.DriverConnectionFactory.createConnection(DriverConnectionFactory.java:39)
    at org.apache.commons.dbcp2.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:256)
    at org.apache.commons.dbcp2.BasicDataSource.validateConnectionFactory(BasicDataSource.java:2301)
    at org.apache.commons.dbcp2.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:2287)
    at org.apache.commons.dbcp2.BasicDataSource.createDataSource(BasicDataSource.java:2038)
    at org.apache.commons.dbcp2.BasicDataSource.getLogWriter(BasicDataSource.java:1588)
    at org.apache.commons.dbcp2.BasicDataSourceFactory.createDataSource(BasicDataSourceFactory.java:588)
    at com.hortonworks.spark.sql.hive.llap.JDBCWrapper.getConnector(HS2JDBCWrapper.scala:333)
    at com.hortonworks.spark.sql.hive.llap.JDBCWrapper.getConnector(HS2JDBCWrapper.scala:340)
    at com.hortonworks.spark.sql.hive.llap.DefaultJDBCWrapper.getConnector(HS2JDBCWrapper.scala)
    at com.hortonworks.spark.sql.hive.llap.HiveWarehouseSessionImpl.lambda$new$0(HiveWarehouseSessionImpl.java:48)
    at com.hortonworks.spark.sql.hive.llap.HiveWarehouseSessionImpl.executeUpdate(HiveWarehouseSessionImpl.java:75)
    at com.hortonworks.spark.sql.hive.llap.CreateTableBuilder.create(CreateTableBuilder.java:79)
    at com.jdbm.test.CatalogTest$$anonfun$1.apply$mcV$sp(CatalogTest.scala:19)
    at com.jdbm.test.CatalogTest$$anonfun$1.apply(CatalogTest.scala:8)
    at com.jdbm.test.CatalogTest$$anonfun$1.apply(CatalogTest.scala:8)
    at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
    at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
    at org.scalatest.Transformer.apply(Transformer.scala:22)
    at org.scalatest.Transformer.apply(Transformer.scala:20)
    at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:186)
    at org.scalatest.TestSuite$class.withFixture(TestSuite.scala:196)
    at org.scalatest.FunSuite.withFixture(FunSuite.scala:1560)
    at org.scalatest.FunSuiteLike$class.invokeWithFixture$1(FunSuiteLike.scala:183)
    at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
    at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
    at org.scalatest.SuperEngine.runTestImpl(Engine.scala:289)
    at org.scalatest.FunSuiteLike$class.runTest(FunSuiteLike.scala:196)
    at org.scalatest.FunSuite.runTest(FunSuite.scala:1560)
    at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
    at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
    at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:396)
    at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:384)
    at scala.collection.immutable.List.foreach(List.scala:381)
    at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:384)
    at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:379)
    at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:461)
    at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:229)
    at org.scalatest.FunSuite.runTests(FunSuite.scala:1560)
    at org.scalatest.Suite$class.run(Suite.scala:1147)
    at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1560)
    at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
    at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
    at org.scalatest.SuperEngine.runImpl(Engine.scala:521)
    at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:233)
    at org.scalatest.FunSuite.run(FunSuite.scala:1560)
    at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:45)
    at org.scalatest.tools.Runner$$anonfun$doRunRunRunDaDoRunRun$1.apply(Runner.scala:1346)
    at org.scalatest.tools.Runner$$anonfun$doRunRunRunDaDoRunRun$1.apply(Runner.scala:1340)
    at scala.collection.immutable.List.foreach(List.scala:381)
    at org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:1340)
    at org.scalatest.tools.Runner$$anonfun$runOptionallyWithPassFailReporter$2.apply(Runner.scala:1011)
    at org.scalatest.tools.Runner$$anonfun$runOptionallyWithPassFailReporter$2.apply(Runner.scala:1010)
    at org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:1506)
    at org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:1010)
    at org.scalatest.tools.Runner$.run(Runner.scala:850)
    at org.scalatest.tools.Runner.run(Runner.scala)
    at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.runScalaTest2(ScalaTestRunner.java:131)
    at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.main(ScalaTestRunner.java:28)
19/05/08 14:58:52 INFO ObjectStore: Initialized ObjectStore
19/05/08 14:58:52 ERROR RetryingHMSHandler: java.lang.NullPointerException
    at org.datanucleus.store.rdbms.query.RDBMSQueryUtils.getStatementForCandidates(RDBMSQueryUtils.java:318)
    at org.datanucleus.store.rdbms.query.JDOQLQuery.compileQueryFull(JDOQLQuery.java:865)
    at org.datanucleus.store.rdbms.query.JDOQLQuery.compileInternal(JDOQLQuery.java:347)
    at org.datanucleus.store.query.Query.executeQuery(Query.java:1816)
    at org.datanucleus.store.query.Query.executeWithArray(Query.java:1744)
    at org.datanucleus.store.query.Query.execute(Query.java:1726)
    at org.datanucleus.api.jdo.JDOQuery.executeInternal(JDOQuery.java:374)
    at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:216)
    at shadehive.org.apache.hadoop.hive.metastore.ObjectStore.getMSchemaVersion(ObjectStore.java:9490)
    at shadehive.org.apache.hadoop.hive.metastore.ObjectStore.getMetaStoreSchemaVersion(ObjectStore.java:9474)
    at shadehive.org.apache.hadoop.hive.metastore.ObjectStore.checkSchema(ObjectStore.java:9431)
    at shadehive.org.apache.hadoop.hive.metastore.ObjectStore.verifySchema(ObjectStore.java:9416)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at shadehive.org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:97)
    at com.sun.proxy.$Proxy22.verifySchema(Unknown Source)
    at shadehive.org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMSForConf(HiveMetaStore.java:700)
    [... remaining frames identical to the first stack trace above ...]

19/05/08 14:58:52 ERROR RetryingHMSHandler: HMSHandler Fatal error: java.lang.NullPointerException
    at org.datanucleus.store.rdbms.query.RDBMSQueryUtils.getStatementForCandidates(RDBMSQueryUtils.java:318)
    at org.datanucleus.store.rdbms.query.JDOQLQuery.compileQueryFull(JDOQLQuery.java:865)
    at org.datanucleus.store.rdbms.query.JDOQLQuery.compileInternal(JDOQLQuery.java:347)
    at org.datanucleus.store.query.Query.executeQuery(Query.java:1816)
    at org.datanucleus.store.query.Query.executeWithArray(Query.java:1744)
    at org.datanucleus.store.query.Query.execute(Query.java:1726)
    at org.datanucleus.api.jdo.JDOQuery.executeInternal(JDOQuery.java:374)
    at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:216)
    at shadehive.org.apache.hadoop.hive.metastore.ObjectStore.getMSchemaVersion(ObjectStore.java:9490)
    at shadehive.org.apache.hadoop.hive.metastore.ObjectStore.getMetaStoreSchemaVersion(ObjectStore.java:9474)
    at shadehive.org.apache.hadoop.hive.metastore.ObjectStore.checkSchema(ObjectStore.java:9431)
    at shadehive.org.apache.hadoop.hive.metastore.ObjectStore.verifySchema(ObjectStore.java:9416)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at shadehive.org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:97)
    at com.sun.proxy.$Proxy22.verifySchema(Unknown Source)
    at shadehive.org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMSForConf(HiveMetaStore.java:700)
    at shadehive.org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:692)
    at shadehive.org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:779)
    at shadehive.org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:540)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at shadehive.org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:147)
    at shadehive.org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:108)
    at shadehive.org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:80)
    at shadehive.org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:93)
    at shadehive.org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:9138)
    at shadehive.org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:170)
    at shadehive.org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:96)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at shadehive.org.apache.hadoop.hive.metastore.utils.JavaUtils.newInstance(JavaUtils.java:84)
    at shadehive.org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:95)
    at shadehive.org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:148)
    at shadehive.org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:119)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:4656)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:4724)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:4704)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:4995)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:311)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:294)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:464)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:393)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:380)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:351)
    at shadehive.org.apache.hadoop.hive.ql.session.SessionState.setAuthorizerV2Config(SessionState.java:957)
    at shadehive.org.apache.hadoop.hive.ql.session.SessionState.setupAuth(SessionState.java:921)
    at shadehive.org.apache.hadoop.hive.ql.session.SessionState.applyAuthorizationPolicy(SessionState.java:1890)
    at shadehive.org.apache.hive.service.cli.CLIService.applyAuthorizationConfigPolicy(CLIService.java:134)
    at shadehive.org.apache.hive.service.cli.CLIService.init(CLIService.java:118)
    at shadehive.org.apache.hive.service.cli.thrift.EmbeddedThriftBinaryCLIService.init(EmbeddedThriftBinaryCLIService.java:63)
    at shadehive.org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:282)
    at shadehive.org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107)
    at org.apache.commons.dbcp2.DriverConnectionFactory.createConnection(DriverConnectionFactory.java:39)
    at org.apache.commons.dbcp2.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:256)
    at org.apache.commons.dbcp2.BasicDataSource.validateConnectionFactory(BasicDataSource.java:2301)
    at org.apache.commons.dbcp2.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:2287)
    at org.apache.commons.dbcp2.BasicDataSource.createDataSource(BasicDataSource.java:2038)
    at org.apache.commons.dbcp2.BasicDataSource.getLogWriter(BasicDataSource.java:1588)
    at org.apache.commons.dbcp2.BasicDataSourceFactory.createDataSource(BasicDataSourceFactory.java:588)
    at com.hortonworks.spark.sql.hive.llap.JDBCWrapper.getConnector(HS2JDBCWrapper.scala:333)
    at com.hortonworks.spark.sql.hive.llap.JDBCWrapper.getConnector(HS2JDBCWrapper.scala:340)
    at com.hortonworks.spark.sql.hive.llap.DefaultJDBCWrapper.getConnector(HS2JDBCWrapper.scala)
    at com.hortonworks.spark.sql.hive.llap.HiveWarehouseSessionImpl.lambda$new$0(HiveWarehouseSessionImpl.java:48)
    at com.hortonworks.spark.sql.hive.llap.HiveWarehouseSessionImpl.executeUpdate(HiveWarehouseSessionImpl.java:75)
    at com.hortonworks.spark.sql.hive.llap.CreateTableBuilder.create(CreateTableBuilder.java:79)
    at com.jdbm.test.CatalogTest$$anonfun$1.apply$mcV$sp(CatalogTest.scala:19)
    at com.jdbm.test.CatalogTest$$anonfun$1.apply(CatalogTest.scala:8)
    at com.jdbm.test.CatalogTest$$anonfun$1.apply(CatalogTest.scala:8)
    ... (ScalaTest/IntelliJ runner frames omitted; identical to the trace above)

19/05/08 14:58:52 WARN Hive: Failed to register all functions.
java.lang.RuntimeException: Unable to instantiate shadehive.org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
    at shadehive.org.apache.hadoop.hive.metastore.utils.JavaUtils.newInstance(JavaUtils.java:86)
    at shadehive.org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:95)
    at shadehive.org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:148)
    at shadehive.org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:119)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:4656)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:4724)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:4704)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:4995)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:311)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:294)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:464)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:393)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:380)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:351)
    at shadehive.org.apache.hadoop.hive.ql.session.SessionState.setAuthorizerV2Config(SessionState.java:957)
    at shadehive.org.apache.hadoop.hive.ql.session.SessionState.setupAuth(SessionState.java:921)
    at shadehive.org.apache.hadoop.hive.ql.session.SessionState.applyAuthorizationPolicy(SessionState.java:1890)
    at shadehive.org.apache.hive.service.cli.CLIService.applyAuthorizationConfigPolicy(CLIService.java:134)
    at shadehive.org.apache.hive.service.cli.CLIService.init(CLIService.java:118)
    at shadehive.org.apache.hive.service.cli.thrift.EmbeddedThriftBinaryCLIService.init(EmbeddedThriftBinaryCLIService.java:63)
    at shadehive.org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:282)
    at shadehive.org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107)
    at org.apache.commons.dbcp2.DriverConnectionFactory.createConnection(DriverConnectionFactory.java:39)
    at org.apache.commons.dbcp2.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:256)
    at org.apache.commons.dbcp2.BasicDataSource.validateConnectionFactory(BasicDataSource.java:2301)
    at org.apache.commons.dbcp2.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:2287)
    at org.apache.commons.dbcp2.BasicDataSource.createDataSource(BasicDataSource.java:2038)
    at org.apache.commons.dbcp2.BasicDataSource.getLogWriter(BasicDataSource.java:1588)
    at org.apache.commons.dbcp2.BasicDataSourceFactory.createDataSource(BasicDataSourceFactory.java:588)
    at com.hortonworks.spark.sql.hive.llap.JDBCWrapper.getConnector(HS2JDBCWrapper.scala:333)
    at com.hortonworks.spark.sql.hive.llap.JDBCWrapper.getConnector(HS2JDBCWrapper.scala:340)
    at com.hortonworks.spark.sql.hive.llap.DefaultJDBCWrapper.getConnector(HS2JDBCWrapper.scala)
    at com.hortonworks.spark.sql.hive.llap.HiveWarehouseSessionImpl.lambda$new$0(HiveWarehouseSessionImpl.java:48)
    at com.hortonworks.spark.sql.hive.llap.HiveWarehouseSessionImpl.executeUpdate(HiveWarehouseSessionImpl.java:75)
    at com.hortonworks.spark.sql.hive.llap.CreateTableBuilder.create(CreateTableBuilder.java:79)
    at com.jdbm.test.CatalogTest$$anonfun$1.apply$mcV$sp(CatalogTest.scala:19)
    at com.jdbm.test.CatalogTest$$anonfun$1.apply(CatalogTest.scala:8)
    at com.jdbm.test.CatalogTest$$anonfun$1.apply(CatalogTest.scala:8)
    ... (ScalaTest/IntelliJ runner frames omitted; identical to the trace above)
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at shadehive.org.apache.hadoop.hive.metastore.utils.JavaUtils.newInstance(JavaUtils.java:84)
    ... 80 more
Caused by: MetaException(message:null)
    at shadehive.org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:84)
    at shadehive.org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:93)
    at shadehive.org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:9138)
    at shadehive.org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:170)
    at shadehive.org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:96)
    ... 85 more
Caused by: java.lang.NullPointerException
    ... (DataNucleus/HMSHandler frames omitted; identical to the NullPointerException trace above)
    ... 89 more
19/05/08 14:58:52 ERROR SessionState: Error setting up authorization: java.lang.RuntimeException: Unable to instantiate shadehive.org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
shadehive.org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate shadehive.org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
    at shadehive.org.apache.hadoop.hive.ql.session.SessionState.setAuthorizerV2Config(SessionState.java:961)
    at shadehive.org.apache.hadoop.hive.ql.session.SessionState.setupAuth(SessionState.java:921)
    at shadehive.org.apache.hadoop.hive.ql.session.SessionState.applyAuthorizationPolicy(SessionState.java:1890)
    at shadehive.org.apache.hive.service.cli.CLIService.applyAuthorizationConfigPolicy(CLIService.java:134)
    at shadehive.org.apache.hive.service.cli.CLIService.init(CLIService.java:118)
    at shadehive.org.apache.hive.service.cli.thrift.EmbeddedThriftBinaryCLIService.init(EmbeddedThriftBinaryCLIService.java:63)
    at shadehive.org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:282)
    at shadehive.org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107)
    at org.apache.commons.dbcp2.DriverConnectionFactory.createConnection(DriverConnectionFactory.java:39)
    at org.apache.commons.dbcp2.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:256)
    at org.apache.commons.dbcp2.BasicDataSource.validateConnectionFactory(BasicDataSource.java:2301)
    at org.apache.commons.dbcp2.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:2287)
    at org.apache.commons.dbcp2.BasicDataSource.createDataSource(BasicDataSource.java:2038)
    at org.apache.commons.dbcp2.BasicDataSource.getLogWriter(BasicDataSource.java:1588)
    at org.apache.commons.dbcp2.BasicDataSourceFactory.createDataSource(BasicDataSourceFactory.java:588)
    at com.hortonworks.spark.sql.hive.llap.JDBCWrapper.getConnector(HS2JDBCWrapper.scala:333)
    at com.hortonworks.spark.sql.hive.llap.JDBCWrapper.getConnector(HS2JDBCWrapper.scala:340)
    at com.hortonworks.spark.sql.hive.llap.DefaultJDBCWrapper.getConnector(HS2JDBCWrapper.scala)
    at com.hortonworks.spark.sql.hive.llap.HiveWarehouseSessionImpl.lambda$new$0(HiveWarehouseSessionImpl.java:48)
    at com.hortonworks.spark.sql.hive.llap.HiveWarehouseSessionImpl.executeUpdate(HiveWarehouseSessionImpl.java:75)
    at com.hortonworks.spark.sql.hive.llap.CreateTableBuilder.create(CreateTableBuilder.java:79)
    at com.jdbm.test.CatalogTest$$anonfun$1.apply$mcV$sp(CatalogTest.scala:19)
    at com.jdbm.test.CatalogTest$$anonfun$1.apply(CatalogTest.scala:8)
    at com.jdbm.test.CatalogTest$$anonfun$1.apply(CatalogTest.scala:8)
    ... (ScalaTest/IntelliJ runner frames omitted; identical to the trace above)
Caused by: shadehive.org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate shadehive.org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:299)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:464)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:393)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:380)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:351)
    at shadehive.org.apache.hadoop.hive.ql.session.SessionState.setAuthorizerV2Config(SessionState.java:957)
    ... 66 more
Caused by: java.lang.RuntimeException: Unable to instantiate shadehive.org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
    at shadehive.org.apache.hadoop.hive.metastore.utils.JavaUtils.newInstance(JavaUtils.java:86)
    at shadehive.org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:95)
    at shadehive.org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:148)
    at shadehive.org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:119)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:4656)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:4724)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:4704)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:4995)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:311)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:294)
    ... 71 more
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at shadehive.org.apache.hadoop.hive.metastore.utils.JavaUtils.newInstance(JavaUtils.java:84)
    ... 80 more
Caused by: MetaException(message:null)
    at shadehive.org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:84)
    at shadehive.org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:93)
    at shadehive.org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:9138)
    at shadehive.org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:170)
    at shadehive.org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:96)
    ... 85 more
Caused by: java.lang.NullPointerException
    ... (DataNucleus/HMSHandler frames omitted; identical to the NullPointerException trace above)
    ... 89 more

Error applying authorization policy on hive configuration: shadehive.org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate shadehive.org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
java.lang.RuntimeException: Error applying authorization policy on hive configuration: shadehive.org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate shadehive.org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
    at shadehive.org.apache.hive.service.cli.CLIService.init(CLIService.java:121)
    at shadehive.org.apache.hive.service.cli.thrift.EmbeddedThriftBinaryCLIService.init(EmbeddedThriftBinaryCLIService.java:63)
    at shadehive.org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:282)
    at shadehive.org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107)
    at org.apache.commons.dbcp2.DriverConnectionFactory.createConnection(DriverConnectionFactory.java:39)
    at org.apache.commons.dbcp2.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:256)
    at org.apache.commons.dbcp2.BasicDataSource.validateConnectionFactory(BasicDataSource.java:2301)
    at org.apache.commons.dbcp2.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:2287)
    at org.apache.commons.dbcp2.BasicDataSource.createDataSource(BasicDataSource.java:2038)
    at org.apache.commons.dbcp2.BasicDataSource.getLogWriter(BasicDataSource.java:1588)
    at org.apache.commons.dbcp2.BasicDataSourceFactory.createDataSource(BasicDataSourceFactory.java:588)
    at com.hortonworks.spark.sql.hive.llap.JDBCWrapper.getConnector(HS2JDBCWrapper.scala:333)
    at com.hortonworks.spark.sql.hive.llap.JDBCWrapper.getConnector(HS2JDBCWrapper.scala:340)
    at com.hortonworks.spark.sql.hive.llap.DefaultJDBCWrapper.getConnector(HS2JDBCWrapper.scala)
    at com.hortonworks.spark.sql.hive.llap.HiveWarehouseSessionImpl.lambda$new$0(HiveWarehouseSessionImpl.java:48)
    at com.hortonworks.spark.sql.hive.llap.HiveWarehouseSessionImpl.executeUpdate(HiveWarehouseSessionImpl.java:75)
    at com.hortonworks.spark.sql.hive.llap.CreateTableBuilder.create(CreateTableBuilder.java:79)
    at com.jdbm.test.CatalogTest$$anonfun$1.apply$mcV$sp(CatalogTest.scala:19)
    at com.jdbm.test.CatalogTest$$anonfun$1.apply(CatalogTest.scala:8)
    at com.jdbm.test.CatalogTest$$anonfun$1.apply(CatalogTest.scala:8)
    at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
    at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
    at org.scalatest.Transformer.apply(Transformer.scala:22)
    at org.scalatest.Transformer.apply(Transformer.scala:20)
    at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:186)
    at org.scalatest.TestSuite$class.withFixture(TestSuite.scala:196)
    at org.scalatest.FunSuite.withFixture(FunSuite.scala:1560)
    at org.scalatest.FunSuiteLike$class.invokeWithFixture$1(FunSuiteLike.scala:183)
    at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
    at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
    at org.scalatest.SuperEngine.runTestImpl(Engine.scala:289)
    at org.scalatest.FunSuiteLike$class.runTest(FunSuiteLike.scala:196)
    at org.scalatest.FunSuite.runTest(FunSuite.scala:1560)
    at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
    at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
    at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:396)
    at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:384)
    at scala.collection.immutable.List.foreach(List.scala:381)
    at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:384)
    at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:379)
    at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:461)
    at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:229)
    at org.scalatest.FunSuite.runTests(FunSuite.scala:1560)
    at org.scalatest.Suite$class.run(Suite.scala:1147)
    at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1560)
    at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
    at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
    at org.scalatest.SuperEngine.runImpl(Engine.scala:521)
    at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:233)
    at org.scalatest.FunSuite.run(FunSuite.scala:1560)
    at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:45)
    at org.scalatest.tools.Runner$$anonfun$doRunRunRunDaDoRunRun$1.apply(Runner.scala:1346)
    at org.scalatest.tools.Runner$$anonfun$doRunRunRunDaDoRunRun$1.apply(Runner.scala:1340)
    at scala.collection.immutable.List.foreach(List.scala:381)
    at org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:1340)
    at org.scalatest.tools.Runner$$anonfun$runOptionallyWithPassFailReporter$2.apply(Runner.scala:1011)
    at org.scalatest.tools.Runner$$anonfun$runOptionallyWithPassFailReporter$2.apply(Runner.scala:1010)
    at org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:1506)
    at org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:1010)
    at org.scalatest.tools.Runner$.run(Runner.scala:850)
    at org.scalatest.tools.Runner.run(Runner.scala)
    at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.runScalaTest2(ScalaTestRunner.java:131)
    at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.main(ScalaTestRunner.java:28)
Caused by: java.lang.RuntimeException: shadehive.org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate shadehive.org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
    at shadehive.org.apache.hadoop.hive.ql.session.SessionState.setupAuth(SessionState.java:929)
    at shadehive.org.apache.hadoop.hive.ql.session.SessionState.applyAuthorizationPolicy(SessionState.java:1890)
    at shadehive.org.apache.hive.service.cli.CLIService.applyAuthorizationConfigPolicy(CLIService.java:134)
    at shadehive.org.apache.hive.service.cli.CLIService.init(CLIService.java:118)
    ... 62 more
Caused by: shadehive.org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate shadehive.org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
    at shadehive.org.apache.hadoop.hive.ql.session.SessionState.setAuthorizerV2Config(SessionState.java:961)
    at shadehive.org.apache.hadoop.hive.ql.session.SessionState.setupAuth(SessionState.java:921)
    ... 65 more
Caused by: shadehive.org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate shadehive.org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:299)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:464)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:393)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:380)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:351)
    at shadehive.org.apache.hadoop.hive.ql.session.SessionState.setAuthorizerV2Config(SessionState.java:957)
    ... 66 more
Caused by: java.lang.RuntimeException: Unable to instantiate shadehive.org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
    at shadehive.org.apache.hadoop.hive.metastore.utils.JavaUtils.newInstance(JavaUtils.java:86)
    at shadehive.org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:95)
    at shadehive.org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:148)
    at shadehive.org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:119)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:4656)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:4724)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:4704)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:4995)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:311)
    at shadehive.org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:294)
    ... 71 more
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at shadehive.org.apache.hadoop.hive.metastore.utils.JavaUtils.newInstance(JavaUtils.java:84)
    ... 80 more
Caused by: MetaException(message:null)
    at shadehive.org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:84)
    at shadehive.org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:93)
    at shadehive.org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:9138)
    at shadehive.org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:170)
    at shadehive.org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:96)
    ... 85 more
Caused by: java.lang.NullPointerException
    at org.datanucleus.store.rdbms.query.RDBMSQueryUtils.getStatementForCandidates(RDBMSQueryUtils.java:318)
    at org.datanucleus.store.rdbms.query.JDOQLQuery.compileQueryFull(JDOQLQuery.java:865)
    at org.datanucleus.store.rdbms.query.JDOQLQuery.compileInternal(JDOQLQuery.java:347)
    at org.datanucleus.store.query.Query.executeQuery(Query.java:1816)
    at org.datanucleus.store.query.Query.executeWithArray(Query.java:1744)
    at org.datanucleus.store.query.Query.execute(Query.java:1726)
    at org.datanucleus.api.jdo.JDOQuery.executeInternal(JDOQuery.java:374)
    at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:216)
    at shadehive.org.apache.hadoop.hive.metastore.ObjectStore.getMSchemaVersion(ObjectStore.java:9490)
    at shadehive.org.apache.hadoop.hive.metastore.ObjectStore.getMetaStoreSchemaVersion(ObjectStore.java:9474)
    at shadehive.org.apache.hadoop.hive.metastore.ObjectStore.checkSchema(ObjectStore.java:9431)
    at shadehive.org.apache.hadoop.hive.metastore.ObjectStore.verifySchema(ObjectStore.java:9416)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at shadehive.org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:97)
    at com.sun.proxy.$Proxy22.verifySchema(Unknown Source)
    at shadehive.org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMSForConf(HiveMetaStore.java:700)
    at shadehive.org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:692)
    at shadehive.org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:779)
    at shadehive.org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:540)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at shadehive.org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:147)
    at shadehive.org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:108)
    at shadehive.org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:80)
    ... 89 more

19/05/08 14:58:52 INFO SparkContext: Invoking stop() from shutdown hook
19/05/08 14:58:52 INFO SparkUI: Stopped Spark web UI at http://10.33.146.118:4040
19/05/08 14:58:52 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
19/05/08 14:58:52 INFO MemoryStore: MemoryStore cleared
19/05/08 14:58:52 INFO BlockManager: BlockManager stopped
19/05/08 14:58:52 INFO BlockManagerMaster: BlockManagerMaster stopped
19/05/08 14:58:52 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
19/05/08 14:58:52 INFO SparkContext: Successfully stopped SparkContext
19/05/08 14:58:52 INFO ShutdownHookManager: Shutdown hook called
19/05/08 14:58:52 INFO ShutdownHookManager: Deleting directory /private/var/folders/kq/kczs__gd0y741jdzq492xxj40000gp/T/spark-77f66741-849d-49db-b8e9-588791d213db

Has anybody tried to execute unit tests with the new connector in embedded mode?
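
For what it's worth, the NullPointerException is thrown inside ObjectStore.verifySchema, which suggests the embedded HiveServer2 started by jdbc:hive2:/// has no backing metastore database to verify. One thing I have been experimenting with (not a confirmed fix — the property values below are my own guesses for a throwaway local setup) is backing the embedded metastore with an in-memory Derby database in hive-site.xml and letting DataNucleus create the schema instead of verifying it:

```xml
<configuration>
  <!-- assumption: point the embedded metastore at a throwaway in-memory Derby DB -->
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:derby:memory:metastore;create=true</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>org.apache.derby.jdbc.EmbeddedDriver</value>
  </property>
  <!-- assumption: auto-create the metastore schema on first use -->
  <property>
    <name>datanucleus.schema.autoCreateAll</name>
    <value>true</value>
  </property>
  <!-- assumption: skip the schema-version check that fails in the trace above -->
  <property>
    <name>hive.metastore.schema.verification</name>
    <value>false</value>
  </property>
</configuration>
```

This also requires the Derby embedded driver on the test classpath. Again, these settings are only suitable for local unit tests, never for a real cluster.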

Thanks,

1 REPLY

Re: Error creating a table using the hive warehouse connector and spark

New Contributor

I am facing the same problem. Pepito, how did you solve yours?