Getting error while loading Druid over Hive

Explorer

Hi All,

I am testing Druid with Hive 2 and managed to push sample data (1.5M aggregated rows) the way described, but when I try to build a dataset with 192M rows, I get an ArrayIndexOutOfBoundsException in io.DruidRecordWriter. I searched the Hive JIRA and found that this is a known bug that has been resolved, but as I understand it the fix is not included in HDP 2.6.2, which I am using. Is there any workaround?
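
For context, the step that triggers this write path is a CTAS from Hive into a Druid-backed table, roughly like the sketch below; the connection string, table, and column names are hypothetical placeholders, not from this post. io.DruidRecordWriter is the Hive-side class that writes the Druid segments during such a statement.

    # Hypothetical reproduction of the failing step: a CTAS into a Druid table.
    # Connection string, table, and column names are placeholders.
    beeline -u "jdbc:hive2://hs2-host:10000/default" -e "
      CREATE TABLE druid_events
      STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler'
      TBLPROPERTIES ('druid.segment.granularity' = 'DAY')
      AS
      SELECT CAST(event_time AS timestamp) AS \`__time\`,  -- Druid requires a __time column
             user_id, country, COUNT(*) AS cnt
      FROM source_events
      GROUP BY CAST(event_time AS timestamp), user_id, country;"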

Thanks

1 ACCEPTED SOLUTION

Expert Contributor

The only fix is to apply the patch and replace the hive-druid-handler jar; it is only one jar that needs to be replaced. Otherwise, HDP 2.6.3 will include the fix if you want to wait. Sorry.
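
A rough sketch of what that replacement can look like on each Hive node follows; the HDP paths and the patched-jar location are assumptions, not confirmed in this thread.

    # Sketch only: paths and the patched jar name are assumptions for a
    # typical HDP 2.6.2 layout; verify against your own cluster first.
    PATCHED=/tmp/hive-druid-handler-patched.jar   # hypothetical patched build

    # Find every deployed copy of the handler jar on this node and swap it.
    find /usr/hdp -name 'hive-druid-handler-*.jar' 2>/dev/null |
    while read -r jar; do
      cp -p "$jar" "$jar.bak"    # keep a backup of the stock jar
      cp "$PATCHED" "$jar"       # drop in the patched build under the same name
    done
    # Repeat on every Hive node, then restart HiveServer2 (and LLAP, if used).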


7 REPLIES

Explorer

I replaced every hive-druid-handler*.jar in my cluster with the new version, but now I am getting an "org.apache.hadoop.hive.ql.metadata.HiveException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.NullPointerException: cache" error. I think I will have to wait for 2.6.3 and hope it works.
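
When a replaced jar still misbehaves, one sanity check is whether the running HiveServer2 has actually picked up the new copy; a sketch along these lines, where the process match and paths are assumptions:

    # Confirm which handler jar the running HiveServer2 JVM has open.
    hs2_pid=$(pgrep -f HiveServer2 | head -n 1)
    lsof -p "$hs2_pid" 2>/dev/null | grep hive-druid-handler

    # Compare the deployed copies against the patched build by checksum
    # (paths assumed; adjust for your install).
    md5sum /usr/hdp/*/hive*/lib/hive-druid-handler-*.jar /tmp/hive-druid-handler-patched.jar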

Expert Contributor

Can you please paste the error stack from the log?

Explorer

Thanks @Slim. I'm attaching the log from one task: hive-20171002172650-1672cd6f-dec7-46ca-bb45-f52491.txt.

Expert Contributor

It seems like you are compiling/building against the wrong Druid version. Can you please explain how you are building this? Are you using Druid 0.10.1 or a previous release?
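
A quick way to check which Druid build is actually installed on an HDP node, assuming the standard HDP layout and the hdp-select tool that ships with it:

    # Report the active HDP version for each installed Druid component.
    hdp-select status | grep -i druid

    # The jar names in the install directory also carry the Druid version
    # (the druid-broker path assumes the standard HDP symlink layout).
    ls /usr/hdp/current/druid-broker/lib 2>/dev/null | grep '^druid-'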

Explorer

@Slim

I am using HDP 2.6.2 and the installed version of Druid is 0.9.2. Is there any way to upgrade Druid while bypassing Ambari?

Master Mentor

@Burak Bicen

I suggest opening a separate thread; that will be more useful, since this is a slightly different question from the one in this thread, even though the topic is still Druid. That way you are more likely to get an accurate response.