Member since 05-07-2018

331 Posts | 45 Kudos Received | 35 Solutions

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 9611 | 09-12-2018 10:09 PM |
| | 3738 | 09-10-2018 02:07 PM |
| | 11506 | 09-08-2018 05:47 AM |
| | 4089 | 09-08-2018 12:05 AM |
| | 4931 | 08-15-2018 10:44 PM |

06-02-2025 06:55 AM

It does look like the query failed with a ClassCastException. The error (org.apache.hadoop.hive.serde2.io.HiveDecimalWritable cannot be cast to org.apache.hadoop.io.LongWritable) indicates a mismatch between the data type Hive expects and the data type it actually encounters while processing the query. From the error trace, Hive is reading a value as a DECIMAL (HiveDecimalWritable) while the table metadata declares a long type (LongWritable).

One possible reason is a schema mismatch: the Hive table schema defines the column with one type, but the underlying data files (e.g., Parquet, ORC, ...) actually contain DECIMAL values for that column.

To validate:
- Run DESCRIBE FORMATTED <your_table_name>; for the table involved in the failing query, and pay close attention to the data types of all columns, especially those that might be involved in the conversion.
- Compare these Hive schema data types with the actual data types in your source data files. For example, if you're using Parquet, use tools like parquet-tools to inspect the schema of the Parquet files; if you're using ORC, use hive --orcfiledump to inspect the schema of the ORC files. A rough sketch of both checks is shown below.
- Also make sure the SerDe points to a valid underlying file format.
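As a rough sketch of those checks (the table name and file paths below are placeholders, not taken from the original question):

```sh
# In beeline / the Hive CLI: check the column types Hive has in the metastore
#   DESCRIBE FORMATTED my_db.my_table;

# Inspect the schema the data files actually carry (paths are examples):
parquet-tools schema /warehouse/my_db.db/my_table/000000_0   # Parquet files
hive --orcfiledump /warehouse/my_db.db/my_table/000000_0     # ORC files
```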
						
					
03-23-2022 03:05 AM

@iamfromsky As this is an older post, you would have a better chance of receiving a resolution by starting a new thread. This will also give you an opportunity to provide details specific to your environment, which could help others give you a more accurate answer to your question. You can link this thread as a reference in your new post.
						
					
01-13-2022 10:28 PM

Hi there, is it dfs_datanode_data_dir_perm? What was your previous value for it when it couldn't write?
						
					
01-11-2021 09:35 PM

Later versions of Hive have a "sys" DB that, under the hood, connects back to the Hive metastore database (e.g., Postgres or whatever), and you can query that. Impala does not seem to be able to see this sys DB, though. There is also an "information_schema" DB with a smaller and cleaner subset, but it points back to sys and is likewise not visible from Impala if you do a "show databases;". You can use "show" statements in impala-shell, but I'm not sure there is a DB to throw SQL at via ODBC/JDBC. Still looking for a way to do this in Impala.
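For reference, a minimal sketch of what querying those Hive-side catalogs can look like from beeline (the sys table and column names below follow the Hive 3 metastore schema and are an assumption to verify against your version):

```sql
-- List tables straight from the metastore-backed sys DB (assumed Hive 3 schema)
SELECT d.NAME AS db_name, t.TBL_NAME, t.TBL_TYPE
FROM sys.TBLS t
JOIN sys.DBS d ON t.DB_ID = d.DB_ID;

-- The cleaner ANSI-style view layer over the same metadata
SELECT table_schema, table_name, column_name, data_type
FROM information_schema.columns
WHERE table_schema = 'default';
```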
						
					
11-19-2020 03:18 AM

I too have the same scenario: the column was DECIMAL and was updated to BIGINT, and now I'm getting an error while querying the column. The data type on the table and in the Parquet file are aligned.

Error: java.io.IOException: java.lang.ClassCastException: org.apache.hadoop.io.LongWritable cannot be cast to org.apache.hadoop.hive.serde2.io.HiveDecimalWritable (state=,code=0)

If you have already resolved the issue, I'd much appreciate it if you could let me know what worked for you.

Thanks,
Dinesh
						
					
08-04-2020 07:30 AM

Could you please let us know which documentation you are talking about?
						
					
04-13-2020 12:46 AM

Hi @elkrish, was this resolved? Can you share if you found a solution for this issue?
						
					
04-09-2020 08:26 AM

So Presto now supports ACID tables, but only for Hive 3. However, the subdirectory exception comes from a configuration on the Presto client side: in the hive.properties file in Presto's catalog directory, add "hive.recursive-directories=true".
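For illustration, the catalog file would end up looking roughly like this (a sketch; the connector name and metastore URI are placeholders for whatever your existing hive.properties already contains, and only the last line is the new setting):

```properties
# etc/catalog/hive.properties on the Presto coordinator and workers (path may vary)
connector.name=hive-hadoop2
hive.metastore.uri=thrift://metastore-host.example.com:9083
hive.recursive-directories=true
```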
						
					
01-30-2020 03:24 AM

How do I check if HS2 can reach port 2181?
						
					