
Unable to create DelimitedInputWriter object in HCatalog Hive Streaming

I am writing to Hive via the HCatalog Hive Streaming API from inside Spark, but I can't instantiate a DelimitedInputWriter object; I get the following runtime exception:

17/12/23 20:32:13 INFO metastore: Connected to metastore.
17/12/23 20:32:13 ERROR Executor: Exception in task 1.0 in stage 0.0 (TID 1)
com.google.common.util.concurrent.UncheckedExecutionException: java.lang.IllegalArgumentException: interface org.apache.hive.hcatalog.common.HiveClientCache$ICacheableMetaStoreClient is not visible from class loader
    at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2263)
    at com.google.common.cache.LocalCache.get(LocalCache.java:4000)
    at com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4789)
    at org.apache.hive.hcatalog.common.HiveClientCache.getOrCreate(HiveClientCache.java:227)
    at org.apache.hive.hcatalog.common.HiveClientCache.get(HiveClientCache.java:202)
    at org.apache.hive.hcatalog.common.HCatUtil.getHiveMetastoreClient(HCatUtil.java:558)
    at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.getMetaStoreClient(HiveEndPoint.java:448)
    at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.<init>(HiveEndPoint.java:274)
    at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.<init>(HiveEndPoint.java:243)
    at org.apache.hive.hcatalog.streaming.HiveEndPoint.newConnectionImpl(HiveEndPoint.java:180)
    at org.apache.hive.hcatalog.streaming.HiveEndPoint.newConnection(HiveEndPoint.java:157)
    at org.apache.hive.hcatalog.streaming.HiveEndPoint.newConnection(HiveEndPoint.java:110)

sample code:

// Connect to the metastore and open a streaming connection (no automatic partition creation)
this.endPt = new HiveEndPoint("thrift://" + hostname + ":9083", dbname, this.tablename, null);
this.connection = endPt.newConnection(false);

// Writer for comma-delimited records mapped onto the table columns
this.writer = new DelimitedInputWriter(columns, ",", endPt);

// Fetch a batch of transactions to write against
int currentBatchSize = 100;
int maxBatchGroups = 100;
this.txnBatch = connection.fetchTransactionBatch(maxBatchGroups, writer);
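For context, here is a minimal, self-contained sketch of the same write path using the org.apache.hive.hcatalog.streaming API outside of Spark; the metastore URI, database, table, and column names are placeholders, not values from my setup:

import org.apache.hive.hcatalog.streaming.DelimitedInputWriter;
import org.apache.hive.hcatalog.streaming.HiveEndPoint;
import org.apache.hive.hcatalog.streaming.StreamingConnection;
import org.apache.hive.hcatalog.streaming.TransactionBatch;

public class HiveStreamingSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder endpoint; partition values are null for an unpartitioned table
        HiveEndPoint endPt = new HiveEndPoint(
                "thrift://metastore-host:9083", "default", "acid_table", null);
        StreamingConnection connection = endPt.newConnection(false);

        // Placeholder column list matching the target table's schema
        String[] columns = {"id", "name"};
        DelimitedInputWriter writer = new DelimitedInputWriter(columns, ",", endPt);

        // Fetch a batch of transactions, write a couple of records, commit
        TransactionBatch txnBatch = connection.fetchTransactionBatch(100, writer);
        try {
            txnBatch.beginNextTransaction();
            txnBatch.write("1,alice".getBytes());
            txnBatch.write("2,bob".getBytes());
            txnBatch.commit();
        } finally {
            txnBatch.close();
            connection.close();
        }
    }
}

Note that the streaming API requires the target table to be transactional (bucketed, stored as ORC, with transactional=true); the table in this sketch is assumed to be set up that way.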

Any suggestions?

Could the wrong dependencies be causing this?