12-27-2016 03:08 PM
I have an HBase table with two column families and access it through Hive and Impala metadata (all columns are strings). However, when I try to insert a new row through the Impala JDBC driver, I receive this error:
com.cloudera.exceptions.ExceptionConverter.toSQLException: [Simba][ImpalaJDBCDriver](500352) Error getting the parameter data type: HIVE_PARAMETER_QUERY_DATA_TYPE_ERR_NON_SUPPORT_DATA_TYPE
java.sql.SQLException: [Simba][ImpalaJDBCDriver](500352) Error getting the parameter data type: HIVE_PARAMETER_QUERY_DATA_TYPE_ERR_NON_SUPPORT_DATA_TYPE
I also tried the same Java program with the Hive JDBC driver and it works fine. How can I resolve this issue with the Impala JDBC driver? Any advice is appreciated.
CDH Version: 5.8
JDBC Impala version: 126.96.36.1996 GA
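For reference, the failing insert looks roughly like the sketch below. The JDBC URL, host, and the column subset are assumptions for illustration, not taken from the original post:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class ImpalaInsertSketch {
    // Assumed JDBC URL -- replace host/port with your Impala daemon.
    static final String URL = "jdbc:impala://impala-host:21050/default";

    // Parameterized insert into a subset of the columns (subset chosen for brevity).
    static String buildInsert() {
        return "INSERT INTO MX_PRC_AUDIT_LOG (id, dsc_api_resolutor, cod_uuid) VALUES (?, ?, ?)";
    }

    static void insertRow(String id, String resolutor, String uuid) throws SQLException {
        try (Connection conn = DriverManager.getConnection(URL);
             PreparedStatement ps = conn.prepareStatement(buildInsert())) {
            ps.setString(1, id);
            ps.setString(2, resolutor);
            ps.setString(3, uuid);
            // The driver fails here with:
            // HIVE_PARAMETER_QUERY_DATA_TYPE_ERR_NON_SUPPORT_DATA_TYPE
            ps.executeUpdate();
        }
    }

    public static void main(String[] args) {
        System.out.println(buildInsert());
    }
}
```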
12-27-2016 04:21 PM
create external table MX_PRC_AUDIT_LOG(
ID string COMMENT 'xxxxx',
DSC_API_RESOLUTOR string COMMENT 'xxxxx',
FCH_PLATFORM_TIME string COMMENT 'xxxxxx',
COD_UUID string COMMENT 'xxxxxx',
FCH_APP_AUDIT_DATE string COMMENT 'xxxxxx',
HOR_APP_AUDIT_TIME string COMMENT 'xxxxxx',
VAL_DEV_ID string COMMENT 'xxxxxx',
USR_LOGGED string COMMENT 'xxxxxxx',
VAL_OPE_PHASE string COMMENT 'xxxxxxx',
DSC_OPE_DESC string COMMENT 'xxxxxxxx',
COD_OPERATION string COMMENT 'xxxxxxx',
VAL_OPE_PARAM string COMMENT 'xxxxxxxxx',
VAL_OPE_MSG_REQUEST string COMMENT 'xxxxxxxx',
VAL_AUTH string COMMENT 'xxxxxxxx',
VAL_XSAN_DEV string COMMENT 'xxxxxxxx',
VAL_XSAN_CHN string COMMENT 'xxxxxxx',
VAL_AUDIT_DIGEST string COMMENT 'xxxxxxxx')
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ( 'hbase.columns.mapping'=':key,searchData:DSC_API_RESOLUTOR,searchData:FCH_PLATFORM_TIME,operationalData:COD_UUID,operationalData:FCH_APP_AUDIT_DATE,operationalData:HOR_APP_AUDIT_TIME,operationalData:VAL_DEV_ID,operationalData:USR_LOGGED,operationalData:VAL_OPE_PHASE,operationalData:DSC_OPE_DESC,operationalData:COD_OPERATION,operationalData:VAL_OPE_PARAM,operationalData:VAL_OPE_MSG_REQUEST,operationalData:VAL_AUTH,operationalData:VAL_XSAN_DEV,operationalData:VAL_XSAN_CHN,operationalData:VAL_AUDIT_DIGEST')
TBLPROPERTIES ("hbase.table.name" = "MX_PRC_AUDIT_LOG");
12-27-2016 04:44 PM
I'm not sure whether the driver correctly handles SQL statements issued through the PreparedStatement implementation...
It's strange, because when I changed my Java implementation to use Statement instead of PreparedStatement, it works fine. However, I don't see any reference saying that PreparedStatement is not supported by the Impala JDBC driver...
Could you please confirm whether Impala JDBC Driver 4.1 (188.8.131.526 GA) supports the PreparedStatement implementation?
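The Statement-based workaround described above can be sketched as follows. The column subset and helper names are hypothetical; note that with plain Statement the values must be inlined as SQL literals, so single quotes have to be escaped by hand:

```java
import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Statement;

public class StatementWorkaround {
    // Build the INSERT with inlined literals; escape single quotes so the
    // literal is not broken (and untrusted input cannot inject SQL).
    static String buildInsert(String id, String resolutor) {
        return "INSERT INTO MX_PRC_AUDIT_LOG (id, dsc_api_resolutor) VALUES ('"
                + id.replace("'", "''") + "', '"
                + resolutor.replace("'", "''") + "')";
    }

    static void insertRow(Connection conn, String id, String resolutor) throws SQLException {
        try (Statement st = conn.createStatement()) {
            st.executeUpdate(buildInsert(id, resolutor));
        }
    }

    public static void main(String[] args) {
        System.out.println(buildInsert("1", "O'Brien"));
    }
}
```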
12-27-2016 05:04 PM
Thanks. The CREATE TABLE looks fine.
As far as I know the JDBC driver you are using should work with prepared statements. Please be aware, however, that the Impala server does not support prepared statements, so you may not get the benefits you are looking for.
The error indicates a type mismatch. Are you trying to insert strange or non-ASCII values? I'm just trying to think of why the JDBC driver reports a type mismatch.
12-28-2016 07:18 AM
I tried to insert "normal" values, i.e. ASCII without strange characters. However, if you say the Impala server does not support prepared statements, I believe that is the reason for this error. It would be good to include this kind of limitation in the JDBC driver documentation.
12-28-2016 12:00 PM
The JDBC driver supports prepared statements on the client side, so they should "work" but will not provide any performance benefits. The "prepared" statement will just be sent to Impala and executed fresh every time.
Did the insert with ASCII values work?
08-28-2018 01:47 AM - edited 08-28-2018 01:53 AM
We had the same problem (with Kudu) and found a workaround: use the setObject method instead of setString (on PreparedStatement).
This code caused the error (HIVE_PARAMETER_QUERY_DATA_TYPE_ERR_NON_SUPPORT_DATA_TYPE) during statement execution:
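(The original snippet is missing from the post; the following is a minimal sketch of the failing pattern, with a hypothetical Kudu table and columns.)

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class SetStringFails {
    // Hypothetical Kudu-backed table; not from the original post.
    static final String SQL = "INSERT INTO events (id, message) VALUES (?, ?)";

    static void insert(Connection conn, String id, String message) throws SQLException {
        try (PreparedStatement ps = conn.prepareStatement(SQL)) {
            // setString triggers the driver's parameter-type lookup, which fails with
            // HIVE_PARAMETER_QUERY_DATA_TYPE_ERR_NON_SUPPORT_DATA_TYPE
            ps.setString(1, id);
            ps.setString(2, message);
            ps.executeUpdate();
        }
    }

    public static void main(String[] args) {
        System.out.println(SQL);
    }
}
```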
This code works fine (and the String is properly stored in Kudu):
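(Again, the original snippet is missing; this sketch shows the same insert with setObject, using the same hypothetical table.)

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class SetObjectWorks {
    // Same hypothetical Kudu-backed table as above.
    static final String SQL = "INSERT INTO events (id, message) VALUES (?, ?)";

    static void insert(Connection conn, String id, String message) throws SQLException {
        try (PreparedStatement ps = conn.prepareStatement(SQL)) {
            // Passing the values via setObject sidesteps the driver's
            // failing parameter-type resolution; the Strings are stored correctly.
            ps.setObject(1, id);
            ps.setObject(2, message);
            ps.executeUpdate();
        }
    }

    public static void main(String[] args) {
        System.out.println(SQL);
    }
}
```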
PS: This looks like a bug in the Cloudera JDBC driver and should be fixed.