Created on 12-13-2024 08:06 AM - edited 12-13-2024 10:03 AM
I ran a query against the database "ASK_SELL", which has 43 million records and is 122 GB in size, in Hadoop. It ran for almost 1 hour and 40 minutes and then stopped with the error message below.
ERROR: Execute error:[Cloudera][HiveJDBCDriver](500051) ERROR processing query/statement. Error Code:0, SQL state: null, Query: SELECT 1 -- /* keep alive */, Error message from Server: Invalid SessionHandle: SessionHandle
WARNING: File deletion failed for DESTINO2.BENEF_CONC.DATA.
What's going on?
Created 12-13-2024 03:51 PM
@zorrofrombrasil Welcome to the Cloudera Community!
To help you get the best possible solution, I have tagged our Hive experts @ggangadharan @james_jones @Shmoo who may be able to assist you further.
Please keep us updated on your post, and we hope you find a satisfactory solution to your query.
Regards,
Diana Torres,Created 12-26-2024 08:21 PM
Created 12-27-2024 02:30 AM
@zorrofrombrasil How are you running this job? Is it through Zeppelin or a JDBC application? I ask because I see a keep-alive query ("SELECT 1") being run, which is meant to keep the session from going idle.
Nevertheless, this "Invalid SessionHandle" error can appear if the connection has switched over to a different HiveServer2 instance. The cause could be a network glitch or a communication drop (timeout) at a middle layer such as a load balancer or Knox (if one is in use).
When dealing with such a large data size, it is better to use a native Thrift client like "beeline", which can be run inside a screen session so the query is not interrupted when the user closes the terminal or shuts down the client machine. For a JDBC application, make sure SocketTimeout is disabled (value = 0), and set hive.server2.idle.operation.timeout and hive.server2.idle.session.timeout to large values, such as 6 hours and 8 hours respectively, in the Hive configuration.
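For the JDBC case, here is a minimal Java sketch of a long-running connection with the socket timeout disabled. It assumes the Cloudera/Simba Hive JDBC driver is on the classpath and that your driver version accepts a SocketTimeout connection property (0 = disabled); the host name, database, credentials, and query are placeholders you would replace with your own.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class LongRunningHiveQuery {
    public static void main(String[] args) throws Exception {
        // SocketTimeout=0 is assumed to disable the client-side socket timeout
        // so a long query is not cut off mid-execution; check your driver docs.
        String url = "jdbc:hive2://hs2-host.example.com:10000/default;SocketTimeout=0";

        try (Connection conn = DriverManager.getConnection(url, "hive_user", "hive_password");
             Statement stmt = conn.createStatement()) {
            // Placeholder long-running statement; substitute the real query.
            stmt.execute("SELECT COUNT(*) FROM ask_sell.some_table");
        }
    }
}

Even with this in place, remember that the server-side idle timeouts (hive.server2.idle.operation.timeout and hive.server2.idle.session.timeout) still need to be large enough, otherwise HiveServer2 can close the session on its end regardless of the client settings.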