Member since 05-02-2017

88 Posts | 173 Kudos Received | 15 Solutions

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 7399 | 09-27-2017 04:21 PM |
| | 3389 | 08-17-2017 06:20 PM |
| | 3071 | 08-17-2017 05:18 PM |
| | 3645 | 08-11-2017 04:12 PM |
| | 5123 | 08-08-2017 12:43 AM |

12-01-2017 09:44 PM | 10 Kudos

Short Description: How to configure KNOX for Hive1 and Hive2 (LLAP) in parallel.

By default, KNOX is configured for Hive1. This article will help you also configure KNOX for Hive2 (LLAP).

Step 1. Before configuring KNOX for Hive2, you have to configure Hive2 for HTTP mode. Go to Ambari -> Services -> Hive -> Configs -> Custom hive-interactive-site and add the properties below:

hive.server2.thrift.http.path=cliservice
hive.server2.transport.mode=http

Restart the Hive service.

Step 2. Now configure KNOX for Hive2 (LLAP):

1) Go to the following location on your KNOX server machine:

# cd /usr/hdp/<HDP VERSION>/knox/data/services

2) Copy the hive directory present in that location and rename the copy to llap:

# cp -rp hive llap

3) Edit service.xml and rewrite.xml as below:

# cd llap/0.13.0/
# vim service.xml
------------
<service role="LLAP" name="llap" version="0.13.0">
    <routes>
        <route path="/llap"/>
    </routes>
    <dispatch classname="org.apache.hadoop.gateway.hive.HiveDispatch" ha-classname="org.apache.hadoop.gateway.hive.HiveHaDispatch"/>
</service>

# vim rewrite.xml
------------
<rules>
    <rule dir="IN" name="LLAP/llap/inbound" pattern="*://*:*/**/llap">
        <rewrite template="{$serviceUrl[LLAP]}"/>
    </rule>
</rules>

4) Go to Ambari -> KNOX -> Configs -> Advanced topology of your KNOX service and add the LLAP service:

<service>
	<role>LLAP</role>
	<url>http://<LLAP server hostname>:<HTTP PORT NUMBER>/{{hive_http_path}}</url>
</service>

Example: <url>http://abcd.example.com:10501/cliservice</url>

5) Restart the KNOX service.
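The rewrite rule in step 3 maps any inbound gateway URL whose path ends in /llap to the LLAP service URL configured in the topology. A rough Python sketch of that behavior (an illustration only, not Knox's implementation; the service URL is the example hostname from step 4):

```python
import re

# Example service URL from step 4 (an assumption for illustration).
SERVICE_URL = "http://abcd.example.com:10501/cliservice"

def rewrite_inbound(url: str) -> str:
    """Mimic the rewrite rule: pattern *://*:*/**/llap -> {$serviceUrl[LLAP]}."""
    # Roughly mirrors Knox's glob: scheme://host:port/any/path/llap
    if re.fullmatch(r"[a-z]+://[^/:]+:\d+(?:/.*)?/llap", url):
        return SERVICE_URL
    return url

print(rewrite_inbound("https://knoxhost:8443/gateway/default/llap"))
# -> http://abcd.example.com:10501/cliservice
```

URLs that do not end in /llap (for example the existing /hive route) are left untouched, which is why the copied llap service can run in parallel with the default Hive1 configuration.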

10-03-2017 03:12 PM

@Ashnee Sharma You have to install this driver on the client side and use it to connect to Hive with all the details. Also check this link: https://community.hortonworks.com/questions/15667/windows-hive-connection-issue-through-odbc-using-h.html

09-27-2017 04:21 PM

@Ashnee Sharma You can find links to the ODBC drivers at https://hortonworks.com/downloads/

09-26-2017 04:02 PM | 1 Kudo

@Ashnee Sharma Which version of the Hive ODBC driver are you using? Are you using the built-in Hortonworks Hive Hadoop driver or an installed DSN driver? The default built-in Hortonworks Hive Hadoop driver in Microsoft will not work; you have to use Other ODBC sources -> Hive DSN driver.

08-17-2017 06:20 PM | 2 Kudos

@arjun more Could you please check the following properties:

User object class*: try changing it from person to user.
Group member attribute*: try changing it from memberof to memberid.
Distinguished name attribute*: try changing it from dn to distinguishedName.

These parameters depend on your LDAP environment. Please check these values once again and try the sync again.

08-17-2017 05:18 PM | 2 Kudos

@arjun more Please check the URL below, which covers similar concerns: https://community.hortonworks.com/questions/106430/is-there-any-way-to-get-the-list-of-user-who-submi.html

08-16-2017 06:40 PM | 2 Kudos

@Sami Ahmad Check the article below for this: https://community.hortonworks.com/articles/56704/secure-kafka-java-producer-with-kerberos.html

08-16-2017 06:33 PM | 1 Kudo

@Sami Ahmad Please check the URLs below:

https://community.hortonworks.com/questions/78843/problems-with-kafka-scripts-after-enabled-kerberos.html
https://community.hortonworks.com/content/supportkb/49422/running-kafka-client-bin-scripts-in-secure-envrion.html

Also check that you have a valid Kerberos ticket. If you use kinit, use this configuration:

KafkaClient {
    com.sun.security.auth.module.Krb5LoginModule required
    useTicketCache=true;
};

If you use a keytab, use this configuration:

KafkaClient {
    com.sun.security.auth.module.Krb5LoginModule required
    useKeyTab=true
    keyTab="/etc/security/keytabs/kafka_server.keytab"
    principal="kafka/kafka1.hostname.com@EXAMPLE.COM";
};

08-11-2017 04:12 PM | 3 Kudos

@arjun more You don't need to edit this column. As the error itself says "FOREIGN KEY (`upgrade_id`)", this will be set as @Jay SenSharma suggested.

Please check the type of the column you are trying to edit:

mysql> desc clusters;
+-----------------------+--------------+------+-----+---------+-------+
| Field                 | Type         | Null | Key | Default | Extra |
+-----------------------+--------------+------+-----+---------+-------+
| cluster_id            | bigint(20)   | NO   | PRI | NULL    |       |
| resource_id           | bigint(20)   | NO   | MUL | NULL    |       |
| upgrade_id            | bigint(20)   | YES  | MUL | NULL    |       |
| cluster_info          | varchar(255) | NO   |     | NULL    |       |
| cluster_name          | varchar(100) | NO   | UNI | NULL    |       |
| provisioning_state    | varchar(255) | NO   |     | INIT    |       |
| security_type         | varchar(32)  | NO   |     | NONE    |       |
| desired_cluster_state | varchar(255) | NO   |     | NULL    |       |
| desired_stack_id      | bigint(20)   | NO   | MUL | NULL    |       |
+-----------------------+--------------+------+-----+---------+-------+

As the upgrade_id column is of type bigint(20) and nullable, its default value is NULL. The field shows as blank in the PostgreSQL DB because of the type of the upgrade_id column.
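The point about the nullable bigint column can be illustrated with a minimal sketch. This uses an in-memory SQLite stand-in, not Ambari's actual schema: a nullable column with no explicit default stores SQL NULL when an insert omits it, which many clients render as a blank field.

```python
import sqlite3

# Minimal stand-in for the clusters table (illustration only, not Ambari's schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE clusters (cluster_id INTEGER PRIMARY KEY, upgrade_id BIGINT)")
# Insert a row without specifying upgrade_id; the nullable column falls back to NULL.
conn.execute("INSERT INTO clusters (cluster_id) VALUES (1)")
row = conn.execute("SELECT upgrade_id FROM clusters WHERE cluster_id = 1").fetchone()
print(row[0])  # None -- SQL NULL, shown as a blank field by many DB clients
```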