Support Questions


Not able to drop Hive table: org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:javax.jdo.JDOException

Guru

Hello,

When I try to drop an external table in Hive, I get the following error. Can someone please help me drop this table?

drop table hst_staging;

Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:javax.jdo.JDOException: Exception thrown when executing query : SELECT 'org.apache.hadoop.hive.metastore.model.MStorageDescriptor' AS `NUCLEUS_TYPE`,`A0`.`INPUT_FORMAT`,`A0`.`IS_COMPRESSED`,`A0`.`IS_STOREDASSUBDIRECTORIES`,`A0`.`LOCATION`,`A0`.`NUM_BUCKETS`,`A0`.`OUTPUT_FORMAT`,`A0`.`SD_ID` FROM `SDS` `A0` WHERE `A0`.`CD_ID` = ? LIMIT 0,1

at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:677)

at org.datanucleus.api.jdo.JDOQuery.executeInternal(JDOQuery.java:388)

at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:225)

at org.apache.hadoop.hive.metastore.ObjectStore.listStorageDescriptorsWithCD(ObjectStore.java:3177)

at org.apache.hadoop.hive.metastore.ObjectStore.removeUnusedColumnDescriptor(ObjectStore.java:3123)

at org.apache.hadoop.hive.metastore.ObjectStore.dropPartitions(ObjectStore.java:1803)

at sun.reflect.GeneratedMethodAccessor370.invoke(Unknown Source)

at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

at java.lang.reflect.Method.invoke(Method.java:498)

at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:103)

at com.sun.proxy.$Proxy8.dropPartitions(Unknown Source)

at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.dropPartitionsAndGetLocations(HiveMetaStore.java:1838)

at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table_core(HiveMetaStore.java:1673)

at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table_with_environment_context(HiveMetaStore.java:1859)

at sun.reflect.GeneratedMethodAccessor247.invoke(Unknown Source)

at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

at java.lang.reflect.Method.invoke(Method.java:498)

at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:147)

at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:105)

at com.sun.proxy.$Proxy12.drop_table_with_environment_context(Unknown Source)

at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.drop_table_with_environment_context(HiveMetaStoreClient.java:2190)

at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.drop_table_with_environment_context(SessionHiveMetaStoreClient.java:120)

at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:1034)

at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:970)

at sun.reflect.GeneratedMethodAccessor228.invoke(Unknown Source)

at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

at java.lang.reflect.Method.invoke(Method.java:498)

at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:178)

at com.sun.proxy.$Proxy13.dropTable(Unknown Source)

at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:1108)

at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:1045)

at org.apache.hadoop.hive.ql.exec.DDLTask.dropTable(DDLTask.java:4084)

at org.apache.hadoop.hive.ql.exec.DDLTask.dropTableOrPartitions(DDLTask.java:3940)

at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:341)

at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)

at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:89)

at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1748)

at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1494)

at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1291)

at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1158)

at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1153)

at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:197)

at org.apache.hive.service.cli.operation.SQLOperation.access$300(SQLOperation.java:76)

at org.apache.hive.service.cli.operation.SQLOperation$2$1.run(SQLOperation.java:253)

at java.security.AccessController.doPrivileged(Native Method)

at javax.security.auth.Subject.doAs(Subject.java:422)

at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)

at org.apache.hive.service.cli.operation.SQLOperation$2.run(SQLOperation.java:264)

at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)

at java.util.concurrent.FutureTask.run(FutureTask.java:266)

at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)

at java.util.concurrent.FutureTask.run(FutureTask.java:266)

at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)

at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)

at java.lang.Thread.run(Thread.java:745)

NestedThrowablesStackTrace:

java.sql.BatchUpdateException: Cannot delete or update a parent row: a foreign key constraint fails ("hive"."COLUMNS_V2", CONSTRAINT "COLUMNS_V2_FK1" FOREIGN KEY ("CD_ID") REFERENCES "CDS" ("CD_ID"))

at sun.reflect.GeneratedConstructorAccessor300.newInstance(Unknown Source)


Master Mentor

@Saurabh

You will need to disable the foreign key checks before you can delete the table, as the constraint otherwise rejects the delete:

"Cannot delete or update a parent row: a foreign key constraint fails"

SET FOREIGN_KEY_CHECKS=0; -- to disable them
SET FOREIGN_KEY_CHECKS=1; -- to re-enable them
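
Note that SET FOREIGN_KEY_CHECKS on its own is session-scoped in MySQL, so a DROP TABLE issued through Hive does not see a value set in a separate mysql client session. A minimal sketch of the global variant, which newly opened metastore connections inherit (assuming MySQL is the metastore database):

SET GLOBAL FOREIGN_KEY_CHECKS=0; -- new connections, e.g. from the metastore service, pick this up
-- retry the DROP TABLE from Hive/Beeline, then restore the default:
SET GLOBAL FOREIGN_KEY_CHECKS=1;

Restarting the Hive metastore after changing the global value re-creates its connections, which should then run with checks disabled.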

Hope that helps

Guru

Thanks @Geoffrey Shelton Okot. Do you mean I need to disable it in MySQL and then drop the table?

If so, I tried that, but it fails with the same error:

mysql> SET FOREIGN_KEY_CHECKS=0;

Query OK, 0 rows affected (0.00 sec)

hive> drop table i0177a_cus_hdr_hst_staging;

Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:javax.jdo.JDOException: Exception thrown when executing query : SELECT 'org.apache.hadoop.hive.metastore.model.MStorageDescriptor' AS `NUCLEUS_TYPE`,`A0`.`INPUT_FORMAT`,`A0`.`IS_COMPRESSED`,`A0`.`IS_STOREDASSUBDIRECTORIES`,`A0`.`LOCATION`,`A0`.`NUM_BUCKETS`,`A0`.`OUTPUT_FORMAT`,`A0`.`SD_ID` FROM `SDS` `A0` WHERE `A0`.`CD_ID` = ? LIMIT 0,1

at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:677)

at org.datanucleus.api.jdo.JDOQuery.executeInternal(JDOQuery.java:388)

New Contributor

@SK1  Have you found out how to drop a table when this error comes up?

 

Community Manager

@Aayuah, as this is an older post, you would have a better chance of receiving a resolution by starting a new thread. This will also be an opportunity to provide details specific to your environment that could aid others in assisting you with a more accurate answer to your question. You can link this thread as a reference in your new post.



Regards,

Vidya Sargur,
Community Manager



Guru

I also tried setting it globally and then restarted the Hive metastore, but I get the same error.

mysql> SET GLOBAL FOREIGN_KEY_CHECKS=0;

Query OK, 0 rows affected (0.00 sec)

mysql> SELECT @@GLOBAL.foreign_key_checks, @@SESSION.foreign_key_checks;

+-----------------------------+------------------------------+
| @@GLOBAL.foreign_key_checks | @@SESSION.foreign_key_checks |
+-----------------------------+------------------------------+
|                           0 |                            1 |
+-----------------------------+------------------------------+
1 row in set (0.00 sec)

drop table i0177a_cus_hdr_hst_staging;

Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:javax.jdo.JDOException: Exception thrown when executing query : SELECT 'org.apache.hadoop.hive.metastore.model.MStorageDescriptor' AS `NUCLEUS_TYPE`,`A0`.`INPUT_FORMAT`,`A0`.`IS_COMPRESSED`,`A0`.`IS_STOREDASSUBDIRECTORIES`,`A0`.`LOCATION`,`A0`.`NUM_BUCKETS`,`A0`.`OUTPUT_FORMAT`,`A0`.`SD_ID` FROM `SDS` `A0` WHERE `A0`.`CD_ID` = ? LIMIT 0,1

at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:677)
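
Since the failure is on the COLUMNS_V2 -> CDS foreign key (CD_ID), it may help to look at the metastore rows involved directly. A diagnostic sketch against the metastore database (table and column names come from the standard Hive metastore schema; the database name hive and the Hive table name are taken from this thread, adjust as needed):

USE hive;

-- locate the table, its storage descriptor and column descriptor
SELECT t.TBL_ID, t.TBL_NAME, s.SD_ID, s.CD_ID
FROM TBLS t JOIN SDS s ON s.SD_ID = t.SD_ID
WHERE t.TBL_NAME = 'i0177a_cus_hdr_hst_staging';

-- count the column rows still referencing that column descriptor
SELECT c.CD_ID, COUNT(*) AS column_rows
FROM TBLS t
JOIN SDS s ON s.SD_ID = t.SD_ID
JOIN COLUMNS_V2 c ON c.CD_ID = s.CD_ID
WHERE t.TBL_NAME = 'i0177a_cus_hdr_hst_staging'
GROUP BY c.CD_ID;

These queries only inspect the rows; any manual cleanup of metastore tables is best done with a fresh backup of the metastore database in hand.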

Master Mentor

@Saurabh

Can you describe table i0177a_cus_hdr_hst_staging?

Can you share the DDL used to create this external table?

Under your current (possibly flawed) design, the rows that reference the parent row must be deleted before the parent row itself can be deleted; in this error, the COLUMNS_V2 rows referencing the CD_ID are what block the delete from CDS when the table is dropped.
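
For reference, the standard HiveQL statements that produce that information (run from the Hive/Beeline session; prefix with the database name if the table is not in the current database):

DESCRIBE FORMATTED i0177a_cus_hdr_hst_staging;
SHOW CREATE TABLE i0177a_cus_hdr_hst_staging;

DESCRIBE FORMATTED shows the table type (EXTERNAL_TABLE), location and SerDe, and SHOW CREATE TABLE reconstructs the DDL used to create it.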