Member since: 09-18-2015

3274 Posts | 1159 Kudos Received | 426 Solutions

        My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2553 | 11-01-2016 05:43 PM |
| | 8436 | 11-01-2016 05:36 PM |
| | 4845 | 07-01-2016 03:20 PM |
| | 8138 | 05-25-2016 11:36 AM |
| | 4285 | 05-24-2016 05:27 PM |

10-13-2021 04:17 PM

Do you have any reference on how to enable Ranger for Kafka sitting in a separate cluster in CDP?

10-11-2021 08:32 AM

STEP 1: Get the certificate from the Ambari server:
echo | openssl s_client -showcerts -connect <AMBARI_HOST>:<AMBARI_HTTPS_PORT> 2>&1 | sed --quiet '/-BEGIN CERTIFICATE-/,/-END CERTIFICATE-/p' > /tmp/ambari_certificate.crt
STEP 2: Get the path of the Ambari truststore and the truststore password from the Ambari properties:
cat /etc/ambari-server/conf/ambari.properties | grep truststore
As per your ambari.properties, the path and password are:
ssl.trustStore.password=<refer to the ambari.properties file>
ssl.trustStore.path=/etc/ambari-server/conf/ambari-server-truststore
STEP 3: Import the certificate into the truststore:
keytool -importcert -file /tmp/ambari_certificate.crt -keystore <keystore-path>
STEP 4: Restart the Ambari server:
ambari-server restart
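
Before importing, it can be worth sanity-checking what STEP 1 actually fetched. A minimal sketch, using the same file path as above:

```
# Show the subject and validity dates of the certificate fetched in STEP 1
openssl x509 -in /tmp/ambari_certificate.crt -noout -subject -dates
```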

10-11-2021 08:27 AM

1 Kudo

To resolve the issue, import the Ambari certificate into the Ambari truststore as follows:
STEP 1: Get the certificate from the Ambari server:
echo | openssl s_client -showcerts -connect <AMBARI_HOST>:<AMBARI_HTTPS_PORT> 2>&1 | sed --quiet '/-BEGIN CERTIFICATE-/,/-END CERTIFICATE-/p' > /tmp/ambari_certificate.crt
STEP 2: Get the path of the Ambari truststore and the truststore password from the Ambari properties:
cat /etc/ambari-server/conf/ambari.properties | grep truststore
As per your ambari.properties, the path and password are:
ssl.trustStore.password=<refer to the ambari.properties file>
ssl.trustStore.path=/etc/ambari-server/conf/ambari-server-truststore
STEP 3: Import the certificate into the truststore:
keytool -importcert -file /tmp/ambari_certificate.crt -keystore <keystore-path>
STEP 4: Restart the Ambari server:
ambari-server restart
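
After STEP 3, you can confirm the certificate landed in the truststore. A minimal sketch; <keystore-path> is the ssl.trustStore.path value from STEP 2, and keytool will prompt for the truststore password:

```
# List truststore entries; the imported Ambari certificate should appear
keytool -list -keystore <keystore-path>
```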

06-27-2021 09:38 PM

Hello @pradeep_tp,

Please restart Ambari Metrics from Ambari. Metrics sometimes become unavailable due to cache or network issues, and a restart often fixes it. Please try and let me know how it goes!

//BR
Roy
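
If the UI route is flaky, the same restart can be driven through Ambari's REST API. A hedged sketch, assuming the default port 8080 and placeholder admin credentials and cluster name:

```
# Stop the Ambari Metrics service (state INSTALLED means stopped)
curl -u admin:admin -H 'X-Requested-By: ambari' -X PUT \
  -d '{"RequestInfo":{"context":"Stop Ambari Metrics"},"Body":{"ServiceInfo":{"state":"INSTALLED"}}}' \
  "http://<AMBARI_HOST>:8080/api/v1/clusters/<CLUSTER_NAME>/services/AMBARI_METRICS"

# Start it again
curl -u admin:admin -H 'X-Requested-By: ambari' -X PUT \
  -d '{"RequestInfo":{"context":"Start Ambari Metrics"},"Body":{"ServiceInfo":{"state":"STARTED"}}}' \
  "http://<AMBARI_HOST>:8080/api/v1/clusters/<CLUSTER_NAME>/services/AMBARI_METRICS"
```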

05-29-2021 02:37 PM

@Onedile wrote:

"Yes, this is possible. You need to kinit with the username that has been granted access to the SQL Server DB and tables; integrated security passes your credentials to the SQL Server using Kerberos: jdbc:sqlserver://sername.domain.co.za:1433;integratedSecurity=true;databaseName=SCHEMA;authenticationScheme=JavaKerberos; This worked for me."

It doesn't work; the issue persists even with the latest MSSQL JDBC driver, because the Kerberos tokens are lost when the mappers spawn (YARN transitions the job to its internal security subsystem):

21/05/29 19:00:40 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1616335290043_2743822
21/05/29 19:00:40 INFO mapreduce.JobSubmitter: Executing with tokens: [Kind: HDFS_DELEGATION_TOKEN, Service: ha-hdfs:nameservice1, Ident: (token for c795701: HDFS_DELEGATION_TOKEN owner=c795701@XX.XXXX.XXXXXXX.COM, renewer=yarn, realUser=, issueDate=1622314832608, maxDate=1622919632608, sequenceNumber=29194128, masterKeyId=1856)]
21/05/29 19:01:15 INFO mapreduce.Job: Task Id : attempt_1616335290043_2743822_m_000000_0, Status : FAILED
Error: java.lang.RuntimeException: java.lang.RuntimeException: com.microsoft.sqlserver.jdbc.SQLServerException: Integrated authentication failed. ClientConnectionId:53879236-81e7-4fc6-88b9-c7118c02e7be
Caused by: java.security.PrivilegedActionException: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)

Use the jTDS driver as recommended here.
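
For reference, a Sqoop import over jTDS typically looks like the sketch below. This is a hedged example, not a command from the thread: the host, domain, credential, and table values are placeholders, and the jTDS jar must be available to Sqoop (e.g., in its lib directory). jTDS authenticates with NTLM username/password, sidestepping the Kerberos TGT lost in the spawned mappers:

```
sqoop import \
  --connect "jdbc:jtds:sqlserver://<SQL_HOST>:1433/<DATABASE>;domain=<AD_DOMAIN>;useNTLMv2=true" \
  --driver net.sourceforge.jtds.jdbc.Driver \
  --username <AD_USER> --password '<PASSWORD>' \
  --table <TABLE> \
  --target-dir /tmp/<TABLE>_import
```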

01-24-2021 01:41 AM

I was just able to confirm that the update command listed is of the PostgreSQL database flavor.

12-08-2020 01:49 PM

Hi, I have permanently deleted the data. Is there any way we can recover the data?

12-07-2020 11:36 AM

2020 update: what are the preferred data quality tools compatible with CDH for Hive, HBase, and Solr? Our team is looking at Apache Griffin.

Regards,
Nithya Koka

12-01-2020 03:47 AM

I'm trying to run a DAG with Airflow 1.10.12 and HDP 3.0.0. When I run the DAG it gets stuck at:

```
Connecting to jdbc:hive2://[Server2_FQDN]:2181,[Server1_FQDN]:2181/default;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2
```

When I run the following from the shell, it connects to Hive with no problem:

```
beeline -u "jdbc:hive2://[Server1_FQDN]:2181,[Server2_FQDN]:2181/default;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2"
```

I've also made a connection like this:

```
Conn Id:        hive_jdbc
Conn Type:
Connection URL: jdbc:hive2://centosserver.son.ir:2181,centosclient.son.ir:2181/default;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2
Login:          hive
Password:       ******
Driver Path:    /usr/hdp/3.0.0.0-1634/hive/jdbc/hive-jdbc-3.1.0.3.0.0.0-1634-standalone.jar
Driver Class:   org.apache.hive.jdbc.HiveDriver
```

I'm not using Kerberos. I've also added hive.security.authorization.sqlstd.confwhitelist.append to the Ambari Custom hive-site:

```
radoop\.operation\.id|mapred\.job\.name|airflow\.ctx\.dag_id|airflow\.ctx\.task_id|airflow\.ctx\.execution_date|airflow\.ctx\.dag_run_id|airflow\.ctx\.dag_owner|airflow\.ctx\.dag_email|hive\.warehouse\.subdir\.inherit\.perms|hive\.exec\.max\.dynamic\.partitions|hive\.exec\.max\.dynamic\.partitions\.pernode|spark\.app\.name
```

Any suggestions? I'm desperate; I've tried every way I know but still nothing.

@nsabharwal @agillan @msumbul1 @deepesh1
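
One check worth running first (a hedged sketch, assuming the usual HDP client path): verify that HiveServer2 actually has a live registration under the ZooKeeper namespace the URL points at, since the JDBC client will wait if no instances are registered:

```
# List HiveServer2 instances registered for service discovery
/usr/hdp/current/zookeeper-client/bin/zkCli.sh -server [Server1_FQDN]:2181 ls /hiveserver2
```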

08-13-2020 09:08 AM

@torafca5

Could you please try downloading the jar from the link below?

http://www.congiu.net/hive-json-serde/1.3.8/hdp23/json-serde-1.3.8-jar-with-dependencies.jar

Once the jar is downloaded, move it to /usr/hdp/3.0.1.0-187/hive/lib, and place it on all the nodes hosting Hive services. Also, please make sure you are not using LLAP (HiveServer2 Interactive) to connect to Hive; the ADD JAR command does not work with LLAP. Implementing the above recommendation should help overcome this issue.
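
A minimal sketch of those two steps, assuming passwordless SSH to the Hive hosts (the host names are placeholders):

```
# Download the SerDe jar, then copy it into the Hive lib directory on every Hive host
wget -P /tmp http://www.congiu.net/hive-json-serde/1.3.8/hdp23/json-serde-1.3.8-jar-with-dependencies.jar
for host in <HIVE_HOST_1> <HIVE_HOST_2>; do
  scp /tmp/json-serde-1.3.8-jar-with-dependencies.jar "$host":/usr/hdp/3.0.1.0-187/hive/lib/
done
```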