
Error: java.io.IOException: java.lang.IllegalArgumentException: bucketId out of range: -1 (state=,code=0)

New Contributor

When I tried to push data from PySpark to Hive, the table was created successfully, but when I run a SELECT query on the table from Hive I get the following error, and the result contains empty columns:
Error: java.io.IOException: java.lang.IllegalArgumentException: bucketId out of range: -1 (state=,code=0)
I am using Apache Ambari version 2.7.0.0.
Here are the properties of the Hive table:

+------------------------------------+----------------------------------------------------+
|             prpt_name              |                     prpt_value                     |
+------------------------------------+----------------------------------------------------+
| bucketing_version                  | 2                                                  |
| numFiles                           | 56                                                 |
| spark.sql.create.version           | 2.3.1.3.0.1.0-187                                  |
| spark.sql.sources.provider         | orc                                                |
| spark.sql.sources.schema.numParts  | 1                                                  |
| spark.sql.sources.schema.part.0    | {"type":"struct","fields":[{"name":"SERVICENAME","type":"string","nullable":true,"metadata":{}}]} |
| totalSize                          | 16675                                              |
| transactional                      | true                                               |
| transient_lastDdlTime              | 1539170912                                         |
+------------------------------------+----------------------------------------------------+
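For reference, the failing read looks roughly like this in Beeline (the table name below is only a placeholder; the property listing above appears to be SHOW TBLPROPERTIES output):

-- placeholder table name; any simple read on the Spark-written table fails the same way
select * from my_spark_table limit 10;
-- Error: java.io.IOException: java.lang.IllegalArgumentException: bucketId out of range: -1 (state=,code=0)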



Expert Contributor

I suspect HIVE-20593 should help fix this issue.

New Contributor

HIVE-20593 has issue type Bug, but I don't understand: is this bug fixed? I have not found any solution for the above issue.

> Load Data for partitioned ACID tables fails with bucketId out of range: -1
> --------------------------------------------------------------------------
>
>                 Key: HIVE-20593
>                 URL: https://issues.apache.org/jira/browse/HIVE-20593
>             Project: Hive
>          Issue Type: Bug
>          Components: Transactions
>    Affects Versions: 3.1.0
>            Reporter: Deepak Jaiswal
>            Assignee: Deepak Jaiswal
>            Priority: Major
>         Attachments: HIVE-20593.1.patch, HIVE-20593.2.patch, HIVE-20593.3.patch
>
>
> Load data for ACID tables is failing to load ORC files when it is converted to IAS job.
>  
> The tempTblObj is inherited from target table. However, the only table property which
> needs to be inherited is bucketing version. Properties like transactional etc should be ignored.
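For illustration only, the pattern this ticket describes looks roughly like the following; the table, column, and file names are hypothetical:

-- hypothetical partitioned ACID table
create table acid_demo (id int, servicename string)
partitioned by (dt string)
stored as orc
tblproperties ('transactional'='true');

-- loading a pre-written ORC file into a partition is the case
-- HIVE-20593 reports as failing with "bucketId out of range: -1"
load data inpath '/tmp/orc_files/part-00000.orc'
into table acid_demo partition (dt='2018-10-10');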


Expert Contributor

HIVE-20593 is fixed in the Hive open-source community. I suspect that applying this code fix to your current cluster will help resolve your issue.

You can contact Hortonworks support to validate my suspicion and get a hotfix.

Explorer

Hi Sahina, just wondering if you found any solution for this issue.


Hi

I faced the same issue. After setting the following properties, it is working fine:

set hive.mapred.mode=nonstrict;
set hive.optimize.ppd=true;
set hive.optimize.index.filter=true;
set hive.tez.bucket.pruning=true;
set hive.explain.user=false;
set hive.fetch.task.conversion=none;
set hive.support.concurrency=true;
set hive.txn.manager=org.apache.hadoop.hive.ql.lockmgr.DbTxnManager;
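
For example, these can be applied in the same Beeline/Hive session right before the failing query; the table name below is only a placeholder:

-- after the set statements above, re-run the read in the same session
select * from my_spark_table limit 10;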
