On HDP 3.0.1 I have Druid backed by Derby, and Hive fails because it cannot load the MySQL JDBC driver.
Creating a Hive table backed by Druid:
CREATE TABLE airline_druid
STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler'
TBLPROPERTIES (
  "druid.segment.granularity" = "MONTH",
  "druid.query.granularity" = "DAY")
AS SELECT
  cast(Year || '-' || Month || '-' || DayofMonth as timestamp) as `__time`,
  cast(Year as string) Year,
  cast(Month as string) Month,
  cast(DayofMonth as string) DayofMonth,
  cast(UniqueCarrier as string) UniqueCarrier,
  cast(Origin as string) Origin,
  cast(Dest as string) Dest,
  cast(CancellationCode as string) CancellationCode,
  Distance,
  Cancelled
FROM airline_raw
LIMIT 100
I get the following error:
ERROR : FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. org.skife.jdbi.v2.exceptions.UnableToObtainConnectionException: java.sql.SQLException: Cannot load JDBC driver class 'com.mysql.jdbc.Driver'
INFO : Completed executing command(queryId=hive_20181108214414_80b7f4e8-3b79-4623-bb6b-293f5c7fe2b2); Time taken: 241.156 seconds
Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. org.skife.jdbi.v2.exceptions.UnableToObtainConnectionException: java.sql.SQLException: Cannot load JDBC driver class 'com.mysql.jdbc.Driver' (state=08S01,code=1)
I did find this HCC Post: https://community.hortonworks.com/content/supportkb/155363/error-failed-execution-error-return-code-...
What is weird is that the error is asking for the MySQL driver, while the Druid metadata is backed by Derby. In fact, I did set these parameters:
set hive.druid.metadata.username=druid;
set hive.druid.metadata.password=*****;
set hive.druid.metadata.uri=jdbc:derby://localhost:1527/druid;
You see that exception because MySQL is the default metadata driver for the Druid-Hive integration. You need to set hive.druid.metadata.db.type=derby.
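For example, the full set of session properties would then look something like the following (the URI and credentials are the ones from the question; adjust them for your cluster):

```sql
-- Tell Hive which metadata store Druid is using (the key fix):
set hive.druid.metadata.db.type=derby;
-- The connection settings already shown in the question:
set hive.druid.metadata.username=druid;
set hive.druid.metadata.password=*****;
set hive.druid.metadata.uri=jdbc:derby://localhost:1527/druid;
```

With db.type left at its default, Hive tries to load com.mysql.jdbc.Driver regardless of the Derby URI, which is exactly the exception above.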
Also, I want to make two points.
First, Derby is only used for integration testing; it only works on a single host, and I did not get a chance to test it outside of that scope.
Second, please keep in mind that Hive is not case sensitive and lowercases all your column names, while Druid is case sensitive; I therefore recommend lowercasing all the column names.
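Applied to the CTAS from the question, that means giving every projected column an explicit lower-case alias, e.g. (a sketch, abbreviated to a few of the columns):

```sql
CREATE TABLE airline_druid
STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler'
TBLPROPERTIES (
  "druid.segment.granularity" = "MONTH",
  "druid.query.granularity" = "DAY")
AS SELECT
  cast(Year || '-' || Month || '-' || DayofMonth as timestamp) as `__time`,
  cast(UniqueCarrier as string) uniquecarrier,  -- lower-case aliases
  cast(Origin as string) origin,
  cast(Dest as string) dest,
  Distance distance,
  Cancelled cancelled
FROM airline_raw;
```

That way the names Hive writes into the Druid datasource match what you will later query, with no surprise case mismatches.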