Member since: 03-06-2020

Posts: 406 | Kudos Received: 56 | Solutions: 37
        My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 372 | 08-29-2025 12:27 AM |
| | 1018 | 11-21-2024 10:40 PM |
| | 976 | 11-21-2024 10:12 PM |
| | 3026 | 07-23-2024 10:52 PM |
| | 2135 | 05-16-2024 12:27 AM |
05-30-2024 10:24 PM

Hi @jayes, may I know how you are exporting the table into HDFS? What command are you using?
						
					
05-16-2024 12:27 AM (2 Kudos)

Hi @d_liu_,

The error points to a network connectivity problem between the Hive server (ip-172-19-36-68) and the HDFS NameNode (ip-172-19-36-94.ap-southeast-2.compute.internal) on port 8020. As per your comment above, that HDFS node does not exist in your cluster, correct? May I know how you are running this query: from Beeline, from Hue, or from a third-party tool?

Search for the above host in all the configuration files (hive-site.xml, hive-env.sh, hive.metastore.uris, etc.), including on the client side from which you run the query. There is a chance that a custom script or config file is causing the issue, so check whether you use any custom scripts to run the jobs.

Regards,
Chethan YM
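One quick way to hunt for the stale NameNode hostname in client-side configs is a recursive grep. This is only a sketch: the hostname is the one from the error above, and the paths are typical default config locations that may differ on your cluster.

```shell
# Search common Hive/Hadoop client config locations for the stale
# NameNode hostname; any hit is a file that still references it.
grep -r "ip-172-19-36-94" /etc/hive/conf/ /etc/hadoop/conf/ 2>/dev/null
```

Running the same grep over any directory holding custom job scripts is equally worthwhile, since hard-coded hostnames often hide there.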
						
					
05-15-2024 02:46 AM (1 Kudo)

@hadoopranger

1. Verify that the JDBC connection string is valid and correct.
2. Use the latest available JDBC driver version.
3. Check that the HS2 servers you are connecting to are accepting connections and are in good health.
4. Check whether Beeline is working.
5. If all of the above are correct, enable JDBC driver trace-level logging to get more details on the error: https://docs.cloudera.com/documentation/other/connectors/hive-jdbc/2-6-15/Cloudera-JDBC-Driver-for-Apache-Hive-Install-Guide.pdf

Regards,
Chethan YM
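Steps 1 and 4 above can be combined into a quick triage: sanity-check the URL shape first, then try the same URL from Beeline to decide whether the server or the driver is at fault. The host and port below are placeholders, not values from the original question.

```shell
# Placeholder HiveServer2 endpoint -- replace with your own.
JDBC_URL="jdbc:hive2://hs2-host.example.com:10000/default"

# Step 1: crude shape check before blaming the driver.
case "$JDBC_URL" in
  jdbc:hive2://*:*) echo "URL shape looks OK" ;;
  *)                echo "Malformed JDBC URL" ;;
esac

# Step 4: try the same URL from Beeline to rule the server in or out.
# beeline -u "$JDBC_URL" -e "SELECT 1"
```

If Beeline succeeds with the same URL, the problem is almost certainly on the driver/client side, which is when the trace logging in step 5 pays off.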
						
					
05-15-2024 02:34 AM (1 Kudo)

Hello @hadoopranger,

Please provide the complete error stack trace from the Hue log and a screenshot of the error from the Hue UI so we can check. Also try restarting Impala and Hue and see if it makes any difference.

Regards,
Chethan YM
						
					
05-14-2024 01:25 AM (1 Kudo)

@vlallana

It is a generic error. Compare the configurations/settings between the prod and dev clusters, and check that the Impala server is healthy and ready to accept connections. Use the latest available driver and check for network connectivity issues. If everything looks good, enable driver trace-level logging [1], reproduce the issue, and review the logs for more details.

Regards,
Chethan YM

[1] https://docs.cloudera.com/documentation/other/connectors/impala-odbc/2-6-11/Cloudera-ODBC-Driver-for-Impala-Install-Guide.pdf
						
					
05-01-2024 05:28 AM

@Anderosn

1. If the content of your flow file is too large for a single CLOB column, split it into smaller chunks and insert each chunk into the database separately.
2. Instead of a CLOB column, consider storing the content in a BLOB (Binary Large Object) column. BLOB columns store binary data, including large files, without the size limitations of CLOB columns.
3. Store the flow file content in an external storage system (e.g., HDFS, Amazon S3) and insert only a reference (a file path or URL) into the database. This approach is useful if the database limits the size of CLOB or BLOB columns.
4. If ExecuteScript is not approved, consider using an external script or application to perform the insertion, triggered from NiFi via the ExecuteProcess or InvokeHTTP processors.

Regards,
Chethan YM
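The chunking idea in option 1 can be prototyped outside NiFi with the standard `split` utility. The file name and chunk size below are purely illustrative.

```shell
# Create a 3 MiB sample "flow file" payload, then split it into 1 MiB
# chunks; each chunk_* piece could be inserted as a separate CLOB row.
dd if=/dev/zero of=payload.dat bs=1024 count=3072 2>/dev/null
split -b 1m payload.dat chunk_
ls chunk_*   # expect three pieces: chunk_aa chunk_ab chunk_ac
```

Inside NiFi itself, the same effect is typically achieved with a splitting processor before the database insert, with the chunk size kept below the column's limit.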
						
					
05-01-2024 04:50 AM (1 Kudo)

@rsheikh

Ensure that the Kerberos configuration (krb5.ini) is correctly set up on your Windows Server 2019 machine; it should contain the realm and KDC (Key Distribution Center) information for your Kerberos setup. Set the java.security.auth.login.config system property to point to the JAAS (Java Authentication and Authorization Service) configuration file (jaas.conf), which defines the login modules used for authentication. Verify that the realm and principal settings in krb5.ini match your Kerberos environment.

Regards,
Chethan YM
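A minimal krb5.ini might look like the following sketch; the realm EXAMPLE.COM and the kdc.example.com host are placeholders for your own environment, not values from the original question.

```ini
; krb5.ini -- realm and KDC details (all values are placeholders)
[libdefaults]
  default_realm = EXAMPLE.COM

[realms]
  EXAMPLE.COM = {
    kdc = kdc.example.com
    admin_server = kdc.example.com
  }
```

The JVM is then pointed at both files at startup, for example with `-Djava.security.krb5.conf=C:\krb5.ini -Djava.security.auth.login.config=C:\jaas.conf` (paths are illustrative).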
						
					
03-19-2024 09:23 AM (1 Kudo)

Hi @Muskan,

You can set it in Cloudera Manager under the Impala advanced configs: "Impala Daemon Command Line Argument Advanced Configuration Snippet (Safety Valve)".

Regards,
Chethan YM
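That safety valve accepts one impalad startup flag per line. The flag below is only an illustration (it sets the session idle timeout in seconds), not necessarily the setting the original question was about:

```
--idle_session_timeout=3600
```

After saving the snippet, the Impala daemons need a restart for the flag to take effect.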
						
					
03-19-2024 09:16 AM

Hi @MrBeasr,

Review the Oozie logs for this workflow for anything suspicious, and paste them here:

oozie job -oozie http://<oozie-server-host>:11000/oozie -log <workflow-id>

Regards,
Chethan YM
						
					
03-18-2024 05:39 AM (2 Kudos)

@Ynr

In the doc below, refer to the "Configuring Logging Options on Windows" section for the steps to enable trace/debug-level logging:
https://docs.cloudera.com/documentation/other/connectors/impala-odbc/2-6-11/Cloudera-ODBC-Driver-for-Impala-Install-Guide.pdf

Regards,
Chethan YM
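As a rough sketch of what that section configures: Simba-based ODBC drivers such as this one expose logging through DSN/driver options along the lines of the fragment below. Treat the option names and values as assumptions to be confirmed against the linked guide, which also describes setting them through the DSN's Logging Options dialog.

```ini
; Illustrative driver logging options (verify names against the guide)
LogLevel=6          ; highest verbosity (trace)
LogPath=C:\odbc-logs
```

Remember to turn logging back off after capturing the repro, since trace logging is verbose and slows the driver down.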
						
					