Member since: 02-27-2017

171 Posts | 9 Kudos Received | 0 Solutions

04-03-2017 10:58 AM

@Deepak Sharma I am running the command below:

curl -iku steve:steve-password -X GET "https://{knox_gateway}:8443/gateway/default/webhdfs/v1/tmp?op=LISTSTATUS"

The following check also fails, with the error shown underneath:

knoxcli.sh --d user-auth-test --cluster default --u steve --p steve-password

Caused by: javax.naming.AuthenticationException: [LDAP: error code 49 - INVALID_CREDENTIALS: Bind failed: ERR_229 Cannot authenticate user uid=steve,ou=people,dc=hadoop,dc=apache,dc=org]

How do I do an LDAP search? I am using the internal LDAP shipped with Knox together with the default topology file (default.xml). I copied the template of the existing users (i.e. tom, guest) in the users.ldif file and modified it to match steve. I do not understand why it is not working when the steps look right. Do I need to modify any other file in addition to users.ldif?
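One way to see what the demo LDAP actually contains is an ldapsearch against it. This is only a sketch: it assumes the ApacheDS demo instance on its default port 33389 and the admin entry shipped in the sample users.ldif; adjust the host, bind DN, and password for your setup.

# Bind as the sample admin entry and look for the steve entry (port, bind DN, and password are assumptions).
ldapsearch -x -H ldap://localhost:33389 \
  -D "uid=admin,ou=people,dc=hadoop,dc=apache,dc=org" -w admin-password \
  -b "ou=people,dc=hadoop,dc=apache,dc=org" "(uid=steve)"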
						
					

04-03-2017 10:08 AM

@Deepak Sharma @Rahul Pathak Could you help?
						
					

04-03-2017 09:03 AM

I have installed Knox as a service using Ambari (on the edge node) of my 6-node HDP 2.5 cluster (1 edge node, 1 NameNode, 1 secondary NameNode, and 3 slave nodes). The Knox gateway and the LDAP server are up and running on the edge node.

I am also able to authenticate to WebHDFS with the existing users (guest, tom, sam) in the users.ldif file, but when I add a new user, e.g. scott, to users.ldif, HDFS returns an Unauthorized error. I copied the template of the existing user "tom" and added the new user as shown below. Any idea why I am facing this issue?

dn: uid=steve,ou=people,dc=hadoop,dc=apache,dc=org
objectclass: top
objectclass: person
objectclass: organizationalPerson
objectclass: inetOrgPerson
cn: scott
sn: scott
uid: scott
userPassword: steve-password

For the new users I get the following error:

[LDAP: error code 49 - INVALID_CREDENTIALS: Bind failed: ERR_229 Cannot authenticate user uid=steve,ou=people,dc=hadoop,dc=apache,dc=org]
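After editing users.ldif, a useful sanity check is to restart the demo LDAP so the file is reloaded and then re-run the Knox authentication test for the new entry. This is only a sketch: the knox-server path assumes an Ambari-managed HDP host, and scott/scott-password stand in for whichever uid and userPassword pair is actually saved in the file.

# Restart the demo LDAP so the edited users.ldif is reloaded (path is an assumption for an Ambari-managed Knox host).
/usr/hdp/current/knox-server/bin/ldap.sh stop
/usr/hdp/current/knox-server/bin/ldap.sh start

# Re-run the Knox authentication check with whichever uid/password pair is actually in users.ldif.
/usr/hdp/current/knox-server/bin/knoxcli.sh user-auth-test --cluster default --u scott --p scott-password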
						
					
Labels: Apache Knox

03-31-2017 10:05 AM

@Rahul Pathak Thanks for your answer. I was looking at the Knox installation document mentioned below. One question: does Kerberos need to be set up before installing Ranger/Ranger KMS?
						
					

03-31-2017 10:03 AM

@Deepak Sharma Thanks for your answer.

1) Can I create the Ranger databases on the existing Ambari PostgreSQL instance and then provide that DB configuration during the Ambari installation? Is that a viable solution? (A sketch of this follows below.)

2) Does Kerberos need to be set up before installing Ranger/Ranger KMS, or is there a way to set up Ranger/Ranger KMS on a non-Kerberized cluster?

Thanks
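A minimal sketch of pre-creating a Ranger database and user on an existing PostgreSQL instance follows. The names ranger, rangeradmin, and rangerpass are placeholders, not values from any HDP document; whatever is used here is what would later be entered in the Ambari Ranger configuration screens.

# Run on the host where the Ambari PostgreSQL instance lives; names and passwords are placeholders.
sudo -u postgres psql -c "CREATE USER rangeradmin WITH PASSWORD 'rangerpass';"
sudo -u postgres psql -c "CREATE DATABASE ranger OWNER rangeradmin;"
sudo -u postgres psql -c "GRANT ALL PRIVILEGES ON DATABASE ranger TO rangeradmin;"

# pg_hba.conf and listen_addresses may also need changes so the Ranger Admin host can reach the database.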
						
					

03-31-2017 09:13 AM

I am planning to install Ranger and Ranger KMS on a 6-node cluster running on Azure VMs. I already have the default Ambari PostgreSQL DB installed on the Ambari server. Can anyone please clarify the following doubts I have about the Ranger installation?

1) Can I use the same PostgreSQL DB for creating the Ranger audit and policy databases?

2) I already have Ambari Infra installed on the cluster. Do I need to use the URL http://solr_host:6083/solr/ranger_audits, where solr_host is the hostname of the machine where Ambari Metrics is installed? (A quick reachability check is sketched below.)

3) Do we need to set up Knox with Kerberos before installing Ranger, or can that be done later as well?

Any help would be appreciated.

Thanks,
Rahul
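For point 2, one way to check whether the audit collection is reachable at that URL is to query Solr's collections API directly. This is only a sketch using the host and port from the question; the port and collection name should match whatever your Ambari Infra Solr instance actually uses.

# List the collections this Solr instance knows about (host and port taken from the question; adjust as needed).
curl "http://solr_host:6083/solr/admin/collections?action=LIST&wt=json"

# If ranger_audits is listed, the audit URL from the question should resolve.
curl "http://solr_host:6083/solr/ranger_audits/select?q=*:*&rows=0&wt=json"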
						
					
Labels: Apache Ranger

03-10-2017 02:46 PM

@Mats Johansson Thanks for sharing the technical user guide. Which version of Google Chrome are you using? I am unable to search for newly created Hive tables from the Atlas UI text search.
						
					

03-10-2017 09:22 AM

I am trying to understand the exact significance of hdfs_path in Apache Atlas search. I created a path in HDFS, but I am not able to find that path through the Atlas UI. Does hdfs_path signify something else? I have installed HDP 2.5 and am using the Atlas version that ships with it.

One more thing: is the Atlas web UI browser-specific? I am able to search newly created Hive tables in IE but not in Chrome or Firefox.
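To take the browser out of the equation, the same searches can be issued against the Atlas REST API. This is only a sketch: it assumes Atlas on its default port 21000 with default admin/admin credentials, and the DSL/full-text endpoints shown are from the Atlas 0.7 line that ships with HDP 2.5, so treat the exact paths as assumptions.

# List hdfs_path entities Atlas knows about (host, port, and credentials are assumptions).
curl -u admin:admin "http://atlas_host:21000/api/atlas/discovery/search/dsl?query=hdfs_path"

# Full-text search, roughly what the UI text search does.
curl -u admin:admin "http://atlas_host:21000/api/atlas/discovery/search/fulltext?query=tmp"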
						
					
Labels: Apache Atlas

03-08-2017 05:44 AM

@Artem Ervits OK, sure. I just wanted to ask about the NameNode as well, in case it also needs an upgrade to a VM with more memory and CPU. What would the ideal approach be in that case? Thanks
						
					