Member since 12-14-2015
- 70 Posts
- 94 Kudos Received
- 16 Solutions
        My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 7549 | 03-14-2017 03:56 PM |
| | 1879 | 03-07-2017 07:20 PM |
| | 5448 | 01-23-2017 05:57 AM |
| | 7903 | 01-23-2017 05:40 AM |
| | 2147 | 10-18-2016 03:36 PM |
08-01-2016 12:40 PM
Check your clusterhost.txt - did you provide the correct IP addresses? 10.00.01 is not a valid IP address; you may have missed a '.' between the two 0's and meant 10.0.0.1 instead. Were you able to ssh to the hosts individually?
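As a quick sanity check, a small script along these lines can flag malformed entries before retrying the install. It is only a sketch and assumes clusterhost.txt lists one IP address per line; the file layout is an assumption, not something confirmed in this thread.

```python
# check_hosts.py - a minimal sketch (assumes clusterhost.txt lists one IP
# address per line; adjust the parsing if your file also carries hostnames).
import ipaddress
import sys

def find_bad_addresses(path="clusterhost.txt"):
    """Return (line_number, text) pairs that are not valid IP addresses."""
    bad = []
    with open(path) as f:
        for lineno, line in enumerate(f, start=1):
            entry = line.strip()
            if not entry:
                continue
            try:
                ipaddress.ip_address(entry)   # "10.00.01" raises ValueError here
            except ValueError:
                bad.append((lineno, entry))
    return bad

if __name__ == "__main__":
    path = sys.argv[1] if len(sys.argv) > 1 else "clusterhost.txt"
    for lineno, entry in find_bad_addresses(path):
        print(f"line {lineno}: '{entry}' is not a valid IP address")
```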
						
					
08-01-2016 12:35 PM (1 Kudo)
How big is your Access database, and why are you trying to migrate? It sounds like you would be better served by moving to MySQL - here are the steps to migrate from Access to MySQL: https://dev.mysql.com/doc/connector-odbc/en/connector-odbc-examples-tools-with-access-export.html  If you want to explore big data, then you can perhaps move to Hive using Sqoop, but I can't really recommend that with certainty without knowing the size or the use cases.
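If the Sqoop route does fit your use case, the import step could look roughly like the sketch below. The MySQL host, database, table, and user names are all placeholders (nothing here comes from the original thread), and it assumes Sqoop and a Hive client are installed on the node where it runs.

```python
# sqoop_import_sketch.py - wraps a Sqoop import of one MySQL table into Hive.
# Host, database, table, and user names are placeholders; requires Sqoop and
# a Hive client on the node where this runs.
import subprocess

def import_table_to_hive(table, mysql_host="mysql-host", database="sales", user="etl_user"):
    cmd = [
        "sqoop", "import",
        "--connect", f"jdbc:mysql://{mysql_host}:3306/{database}",
        "--username", user,
        "-P",                       # prompt for the password interactively
        "--table", table,
        "--hive-import",            # create and load a Hive table from the import
        "--hive-table", f"default.{table}",
    ]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    import_table_to_hive("orders")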
						
					
07-31-2016 05:09 PM
Have you explored the JSON SerDe - https://github.com/rcongiu/Hive-JSON-Serde? I would write a utility script that converts your dataset to JSON (inclusive of serNo, Country, cities, date) and then load the records into Hive using the JSON SerDe. For more details on Hive SerDes, refer to https://cwiki.apache.org/confluence/display/Hive/DeveloperGuide#DeveloperGuide-HiveSerDe
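The conversion step could be as simple as the sketch below. It assumes the source dataset is a CSV with exactly the columns mentioned above and that the cities field is a semicolon-separated list; those layout details are assumptions for illustration, not facts from the thread.

```python
# to_json_lines.py - converts a CSV export (serNo, Country, cities, date) into
# newline-delimited JSON that the Hive JSON SerDe can read. Column names,
# the ";"-separated cities field, and file paths are illustrative assumptions.
import csv
import json

def convert(src="dataset.csv", dst="dataset.json"):
    with open(src, newline="") as fin, open(dst, "w") as fout:
        for row in csv.DictReader(fin):
            record = {
                "serNo": int(row["serNo"]),
                "country": row["Country"],
                "cities": row["cities"].split(";"),
                "date": row["date"],
            }
            fout.write(json.dumps(record) + "\n")   # one JSON object per line

if __name__ == "__main__":
    convert()
```

The resulting newline-delimited JSON file can then be placed in HDFS and exposed through an external table declared with ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe' from the repository above.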
						
					
06-03-2016 08:51 PM (1 Kudo)
Thanks to @nate cole and @David Schorow. Here is the curl call that returns a response: http://localhost:8080/api/v1/clusters/c1/services/ZOOKEEPER/components/ZOOKEEPER_CLIENT?format=client_config_tar
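Since the related question mentions driving this from a custom script, here is a hedged Python equivalent of that curl call; the admin:admin credentials and the cluster name c1 are assumptions about a default install, not details from the thread.

```python
# fetch_zk_client_config.py - downloads the ZooKeeper client configuration
# tarball from the Ambari REST API (same endpoint as the curl call above).
# The admin:admin credentials and cluster name "c1" are assumptions.
import base64
import urllib.request

URL = ("http://localhost:8080/api/v1/clusters/c1/services/ZOOKEEPER/"
       "components/ZOOKEEPER_CLIENT?format=client_config_tar")

def fetch(dest="zookeeper_client_config.tar.gz", user="admin", password="admin"):
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    request = urllib.request.Request(URL, headers={
        "Authorization": f"Basic {token}",
        "X-Requested-By": "ambari",   # header Ambari expects from API clients
    })
    with urllib.request.urlopen(request) as response, open(dest, "wb") as out:
        out.write(response.read())

if __name__ == "__main__":
    fetch()
```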
						
					
06-03-2016 08:49 PM (2 Kudos)
The Ambari UI has an option to download the client configuration; however, the need is to use it as part of a custom script.
						
					
Labels: Apache Ambari
    
	
		
		
05-06-2016 02:49 PM (1 Kudo)
Because this is the question that I get asked most often by enterprise customers in the field, here is the answer: Ranger currently supports resource-based and tag-based policies for Hive (as well as HDFS files, HBase, etc.), where you can mark a column as unauthorized for a specific user or user group. This will fail the query by that user/group altogether. However, there is work in progress to make queries involving unauthorized columns simply mask (redact) the data instead of failing altogether. Here is the Jira: https://hortonworks.jira.com/browse/RMP-3705
						
					
05-06-2016 02:45 PM (1 Kudo)
This is one of the most common questions I get asked by customers - they say that failing a Hive query altogether makes no sense for them in an enterprise environment, and they would rather have the data redacted.
						
					
Labels: Apache Ranger
    
	
		
		
05-05-2016 09:48 PM
@John Yawney - HDP 2.4.2 is being targeted for May 2016 (in the next few days), which I think will have Atlas 0.6 in it. Hopefully that helps.
						
					
05-05-2016 06:58 PM
It looks like you are running the code as the 'bigotes' user. Can you check whether that is correct and whether you have sufficient write privileges in that user's directory?
						
					
05-05-2016 06:53 PM (2 Kudos)
Only Atlas 0.5 is packaged into HDP 2.4. If you want to upgrade manually, you have to first remove the component from Ambari and then manage Atlas yourself.
						
					