External table not loading data after Alter
Created 08-18-2020 09:25 AM
Hi,
I created an external table with column 'A' declared as string, although it actually stores a decimal value. I loaded data into the table, and then during report creation I realized my mistake and altered column 'A' to double. Now I want to access that data, but I get the error below. I tried
set hive.msck.path.validation=ignore;
msck repair table tablename;
But I still get the error below.
Bad status for request TFetchResultsReq(fetchType=0, operationHandle=TOperationHandle(hasResultSet=True, modifiedRowCount=None, operationType=0, operationId=THandleIdentifier(secret='\nv\xe01\x14\x7fD\xd2\xb0\xadk\xd1\xbe\x9a\xd2\xd6', guid='\x19\xc2\x13b\x82\x96O\xd1\x93\x04\xf7\xdd\xc5\xcfz\xeb')), orientation=4, maxRows=100): TFetchResultsResp(status=TStatus(errorCode=0, errorMessage='java.io.IOException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.ClassCastException: org.apache.hadoop.io.Text cannot be cast to org.apache.hadoop.io.DoubleWritable', sqlState=None, infoMessages=['*org.apache.hive.service.cli.HiveSQLException:java.io.IOException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.ClassCastException: org.apache.hadoop.io.Text cannot be cast to org.apache.hadoop.io.DoubleWritable:14:13', 'org.apache.hive.service.cli.operation.SQLOperation:getNextRowSet:SQLOperation.java:496', 'org.apache.hive.service.cli.operation.OperationManager:getOperationNextRowSet:OperationManager.java:297', 'org.apache.hive.service.cli.session.HiveSessionImpl:fetchResults:HiveSessionImpl.java:868', 'org.apache.hive.service.cli.CLIService:fetchResults:CLIService.java:507', 'org.apache.hive.service.cli.thrift.ThriftCLIService:FetchResults:ThriftCLIService.java:708', 'org.apache.hive.service.rpc.thrift.TCLIService$Processor$FetchResults:getResult:TCLIService.java:1717', 'org.apache.hive.service.rpc.thrift.TCLIService$Processor$FetchResults:getResult:TCLIService.java:1702', 'org.apache.thrift.ProcessFunction:process:ProcessFunction.java:39', 'org.apache.thrift.TBaseProcessor:process:TBaseProcessor.java:39', 'org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingProcessor:process:HadoopThriftAuthBridge.java:605', 'org.apache.thrift.server.TThreadPoolServer$WorkerProcess:run:TThreadPoolServer.java:286', 'java.util.concurrent.ThreadPoolExecutor:runWorker:ThreadPoolExecutor.java:1149', 'java.util.concurrent.ThreadPoolExecutor$Worker:run:ThreadPoolExecutor.java:624', 'java.lang.Thread:run:Thread.java:748', '*java.io.IOException:org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.ClassCastException: org.apache.hadoop.io.Text cannot be cast to org.apache.hadoop.io.DoubleWritable:16:2', 'org.apache.hadoop.hive.ql.exec.FetchTask:fetch:FetchTask.java:164', 'org.apache.hadoop.hive.ql.Driver:getResults:Driver.java:2227', 'org.apache.hive.service.cli.operation.SQLOperation:getNextRowSet:SQLOperation.java:491', '*org.apache.hadoop.hive.ql.metadata.HiveException:java.lang.ClassCastException: org.apache.hadoop.io.Text cannot be cast to org.apache.hadoop.io.DoubleWritable:25:9', 'org.apache.hadoop.hive.ql.exec.ListSinkOperator:process:ListSinkOperator.java:97', 'org.apache.hadoop.hive.ql.exec.Operator:forward:Operator.java:882', 'org.apache.hadoop.hive.ql.exec.LimitOperator:process:LimitOperator.java:63', 'org.apache.hadoop.hive.ql.exec.Operator:forward:Operator.java:882', 'org.apache.hadoop.hive.ql.exec.SelectOperator:process:SelectOperator.java:95', 'org.apache.hadoop.hive.ql.exec.Operator:forward:Operator.java:882', 'org.apache.hadoop.hive.ql.exec.TableScanOperator:process:TableScanOperator.java:130', 'org.apache.hadoop.hive.ql.exec.FetchOperator:pushRow:FetchOperator.java:438', 'org.apache.hadoop.hive.ql.exec.FetchOperator:pushRow:FetchOperator.java:430', 'org.apache.hadoop.hive.ql.exec.FetchTask:fetch:FetchTask.java:146', '*java.lang.ClassCastException:org.apache.hadoop.io.Text cannot be cast to org.apache.hadoop.io.DoubleWritable:30:5', 
'org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableDoubleObjectInspector:get:WritableDoubleObjectInspector.java:36', 'org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableDoubleObjectInspector:getPrimitiveJavaObject:WritableDoubleObjectInspector.java:46', 'org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorUtils:copyToStandardObject:ObjectInspectorUtils.java:412', 'org.apache.hadoop.hive.serde2.SerDeUtils:toThriftPayload:SerDeUtils.java:170', 'org.apache.hadoop.hive.serde2.thrift.ThriftFormatter:convert:ThriftFormatter.java:49', 'org.apache.hadoop.hive.ql.exec.ListSinkOperator:process:ListSinkOperator.java:94'], statusCode=3), results=None, hasMoreRows=None)
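For reference, this is roughly the sequence of statements I ran (a simplified sketch; the real table name, location, and storage format are different):

-- Column A was declared as string even though the files hold decimal values
CREATE EXTERNAL TABLE tablename (
  A STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE
LOCATION '/path/to/tablename/data';

-- After loading data, I changed the column type for reporting
ALTER TABLE tablename CHANGE COLUMN A A DOUBLE;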
How can I fix my problem?
Thanks
Created 08-21-2020 01:06 AM
The error is here: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.ClassCastException: org.apache.hadoop.io.Text cannot be cast to org.apache.hadoop.io.DoubleWritable
This says Hive is trying to cast a Text value to a Double at read time, which is not possible. Are you sure the column is still marked as string? Casting a string to double explicitly is always possible, so please double-check the current table definition.
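If the column really is still a string, one way to read it as a number is an explicit cast at query time, for example (table and column names are just the ones from your post, adjust as needed):

SELECT CAST(A AS DOUBLE) AS a_double
FROM tablename;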
Another solution is to drop and recreate the table with the correct data types. Since it is an external table, there is no data loss anyway; the underlying files stay in place.
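A rough sketch of that approach (the LOCATION, storage format, and delimiter below are placeholders; match them to your existing table definition):

-- Dropping an external table removes only the metadata; the data files under LOCATION remain on HDFS
DROP TABLE IF EXISTS tablename;

-- Recreate the table with column A as double this time
CREATE EXTERNAL TABLE tablename (
  A DOUBLE
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE
LOCATION '/path/to/tablename/data';

-- If the table is partitioned, re-register the partitions
MSCK REPAIR TABLE tablename;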
Hope this helps. If this comment helps you find a solution or move forward, please accept it as a solution for other community members.
