Member since: 09-15-2020

243 Posts | 19 Kudos Received | 7 Solutions
My Accepted Solutions

| Title | Views | Posted |
|---|---|---|
|  | 2779 | 11-08-2024 08:45 PM |
|  | 1550 | 02-22-2024 02:53 AM |
|  | 839 | 02-21-2024 06:55 AM |
|  | 1594 | 02-20-2024 09:20 AM |
|  | 947 | 02-15-2024 08:05 AM |
			
    
	
		
		
02-23-2024 10:21 AM

Hi @yo_leo, as per the documentation, you should have at most 50 executors per coordinator, which is the recommended maximum. The actual value may vary with the workload and the complexity of the queries executed in Impala.

https://impala.apache.org/docs/build/asf-site-html/topics/impala_scaling_limits.html
https://impala.apache.org/docs/build/asf-site-html/topics/impala_dedicated_coordinator.html
    
	
		
		
02-23-2024 06:23 AM

Hi @Kropiciel, to connect the ODBC driver to multiple HiveServer2 instances, you can configure High Availability for Hive behind a load balancer (see the document below), then use the load-balancer hostname and port in the connection string.

https://docs.cloudera.com/cdp-private-cloud-base/7.1.9/configuring-apache-hive/topics/hive-ha-loadbalancer.html

Let us know if this helps.

As for the Python connection, use the code below (the `hive://` SQLAlchemy dialect is provided by PyHive) and check how it goes:

```python
# Requires PyHive with its SQLAlchemy "hive" dialect installed.
from sqlalchemy import create_engine, text

# Input information -- replace the placeholders with your values
host = "<hive_hostname>"   # use the load-balancer hostname when HS2 HA is configured
port = 10000
schema = "<schema_name>"
table = "<table_name>"

# Execution
engine = create_engine(f"hive://{host}:{port}/{schema}")
with engine.connect() as conn:
    result = conn.execute(text(f"SELECT * FROM {table} LIMIT 10"))  # example query
    for row in result:
        print(row)
```
    
	
		
		
02-23-2024 05:44 AM

Hi @Kalpit, can you confirm the version of the Hadoop cluster?
    
	
		
		
02-22-2024 02:53 AM
1 Kudo

Hi @Shivakuk, as far as I have tested, Sentry does not support separate DROP and DELETE privileges. However, if you want to remove DROP access from the admin user, you first have to remove the ALL privilege and grant only the SELECT and INSERT privileges to the user.

NOTE: The DELETE, UPDATE, and UPSERT operations require the ALL privilege on the database/table/column.
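As a rough sketch of that sequence (the role, group, and database names below are placeholders for illustration, not from the original thread), the statements could look like this when run through beeline or impala-shell with Sentry enabled:

```sql
-- Hypothetical role, group, and database names; adjust to your environment.
CREATE ROLE limited_admin_role;
GRANT ROLE limited_admin_role TO GROUP admins;

-- Remove the broad privilege first, then grant only what is needed.
REVOKE ALL ON DATABASE sales_db FROM ROLE limited_admin_role;
GRANT SELECT ON DATABASE sales_db TO ROLE limited_admin_role;
GRANT INSERT ON DATABASE sales_db TO ROLE limited_admin_role;

-- Verify the resulting grants.
SHOW GRANT ROLE limited_admin_role ON DATABASE sales_db;
```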
						
					
			
    
	
		
		
02-21-2024 08:15 AM

Hi @SDPLearner, can you please check and confirm whether it works using impala-shell?
			
    
	
		
		
02-21-2024 06:55 AM
1 Kudo

Hi @wert_1311, thank you for reaching out to the Cloudera Community. Cloudera provides an observability tool called Cloudera Observability, which helps you monitor jobs within a cluster. You can refer to the documentation below on how this tool can be configured and used.

https://docs.cloudera.com/observability/cloud/overview/topics/obs-understanding-observ.html
			
    
	
		
		
02-15-2024 08:27 AM
1 Kudo

Hi @drgenious, please share the Sqoop console logs (run with --verbose) and the YARN application logs for review.
			
    
	
		
		
02-15-2024 08:05 AM
1 Kudo

Hi @Hae, you may refer to the documents below to generate an authentication token and access the Hue API.

https://docs.cloudera.com/data-warehouse/cloud/managing-warehouses/topics/dw-jwt-generate-token-later.html
https://docs.gethue.com/developer/api/rest/

Let us know if this helps!
			
    
	
		
		
02-15-2024 07:54 AM

Hi @drgenious,
1. Log in to the Hue Web UI.
2. Go to the Documents page.
3. Right-click the document associated with your workflow and use the Download option to export the workflow.
			
    
	
		
		
02-13-2024 11:19 AM
1 Kudo

Hi @ipson, please check the table schema and verify whether the 'external.table.purge'='true' property is set. This property controls how DROP TABLE and ALTER TABLE work. If it is not present, add it for the required table by issuing the ALTER TABLE statement twice:

```sql
ALTER TABLE t SET TBLPROPERTIES ('external.table.purge'='true');
ALTER TABLE t SET TBLPROPERTIES ('external.table.purge'='true');
```

The first invocation sets the property in HMS. The second one persists it to Iceberg metadata.

Let me know if this helps.
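If useful, a quick way to confirm the property is now present (using the same table name `t` as in the example above) is to inspect the table metadata and look for it under the table parameters:

```sql
-- Works from both Hive (beeline) and impala-shell; check the Table Parameters section.
DESCRIBE FORMATTED t;
```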
						
					
        













