Member since: 09-24-2015

- 816 Posts
- 488 Kudos Received
- 189 Solutions

My Accepted Solutions

| Title | Views | Posted |
|---|---|---|
|  | 3111 | 12-25-2018 10:42 PM |
|  | 13999 | 10-09-2018 03:52 AM |
|  | 4689 | 02-23-2018 11:46 PM |
|  | 2405 | 09-02-2017 01:49 AM |
|  | 2825 | 06-21-2017 12:06 AM |

01-18-2016 11:29 PM | 2 Kudos

Looks like a repository issue. Check your *.list files and try running "apt-get update" as suggested. Also, instead of downloading packages directly from the Internet, it's better to set up local repositories; you can find the details here: http://docs.hortonworks.com/HDPDocuments/Ambari-2.2.0.0/bk_Installing_HDP_AMB/content/_using_a_local_repository.html

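For reference, a minimal sketch of that check on a Debian/Ubuntu node (the paths below are the stock apt defaults; adjust them if your repo files live elsewhere):

```bash
# list the Ambari/HDP repository definitions apt knows about
cat /etc/apt/sources.list.d/*.list

# refresh the package index; any URL that fails here points at the broken repo entry
sudo apt-get update
```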
						
					
01-17-2016 02:21 AM

@Michael M thanks for the clarification. @Roman Sydorov Please see my edited answer above.
						
					
01-14-2016 05:22 PM | 1 Kudo

@Roman Sydorov I haven't tried it, but I don't see any reason why not. You can start each Flume version with the command below; just create separate conf directories and set the respective classpaths in flume-env.sh. Each version can run multiple agents, and each agent runs as a separate process. However, from Ambari you can start and manage only the one Flume version specified by default in /etc/flume/conf.

`/usr/hdp/current/flume-server/bin/flume-ng agent -c /etc/flume/conf -f /etc/flume/conf/flume.conf -n agent`

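To make the idea concrete, a rough sketch of running two versions side by side (the version directories, conf paths, and agent names are placeholders; substitute your actual install locations):

```bash
# one conf directory per Flume version, each with its own flume-env.sh and classpath;
# run each command in its own shell (or background it)
/usr/hdp/2.2.0.0/flume/bin/flume-ng agent -c /etc/flume-old/conf -f /etc/flume-old/conf/flume.conf -n agentA
/usr/hdp/2.3.0.0/flume/bin/flume-ng agent -c /etc/flume-new/conf -f /etc/flume-new/conf/flume.conf -n agentB
```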
 
						
					
01-14-2016 04:20 AM | 2 Kudos

This also works with Ambari-2.0.x; we have a few "small" clusters running only Kafka, ZooKeeper, and Flume.
						
					
01-07-2016 04:20 AM

@Paul Wilson The delimiter in "FIELDS TERMINATED BY" has to match the delimiter used in the input file. After the "LOAD DATA ...", run for example "SELECT id, career FROM yep" to make sure the Hive table was loaded successfully. If it was, but the job still fails, then it could be something permission-related. Can you run other Hive queries using MR? And what versions of HDP/Hive/HBase are you using?

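A quick way to run both checks from the shell (the input file path is hypothetical; the table and column names are the ones from this thread):

```bash
# confirm which delimiter the raw input file actually uses
head -n 3 /tmp/input_data.csv

# confirm the LOAD DATA actually populated the table
hive -e "SELECT id, career FROM yep LIMIT 10;"
```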
						
					
12-31-2015 09:54 AM | 2 Kudos

In your Hive table definition it should be ... FIELDS TERMINATED BY ',' ... that is, terminated by a comma, not by a space. Otherwise it's fine and it works; I had a few free moments and just tried it on the HDP-2.3.2 sandbox. Also, you may wish to remove the table header row when working with real data.

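As a minimal sketch of the corrected definition (the table name, column types, and file path are placeholders, with the schema assumed from this thread):

```bash
# note FIELDS TERMINATED BY ',' (comma), not ' ' (space)
hive -e "
CREATE TABLE yep_demo (id INT, career STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE;
LOAD DATA LOCAL INPATH '/tmp/yep_demo.csv' INTO TABLE yep_demo;"
```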
						
					
12-26-2015 01:12 AM | 1 Kudo

Yes, you can stop Oozie and Hive, and also HBase and Ranger if they are running. Then, depending on how much RAM you have on your laptop, you can check and reduce the Base Memory in your VM's Settings --> System. The default for the Sandbox on Mac is 8 GB; not sure about Windows. If you are just after spark-shell, you can also stop Ambari (ambari-server and ambari-agent).

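If you do stop Ambari itself, a minimal sketch from inside the sandbox VM (stop the other services from the Ambari UI first, while it is still up):

```bash
# stop the Ambari agent and server; spark-shell keeps working without them
sudo ambari-agent stop
sudo ambari-server stop
```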
						
					
12-24-2015 04:53 AM

@Darpan Patel Regarding NN HA support: as I mentioned above, based on our recent experience with Ambari-2.1.2.1 in a kerberized cluster, the Files and Hive views support NN HA, while the Pig view doesn't. I haven't had time to explore Ambari-2.2 yet.
  
						
					
12-23-2015 04:14 PM

Is there any special reason you are using the http Hive transport mode? (Knox, for example, requires http mode.) If not, then set the transport mode to binary and the Hive view should work. If you want to keep the http transport, then you need Ambari-2.1.2.1 or 2.2.

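To see which mode HiveServer2 is currently using, something like this works (the property name is the standard HiveServer2 one; the conf path assumes the usual HDP layout):

```bash
# "binary" is the default; "http" is typically needed only for setups like Knox
grep -A1 'hive.server2.transport.mode' /etc/hive/conf/hive-site.xml
```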
						
					
12-23-2015 03:06 PM

Okay, what's the status of the Files view now? Can you browse the files? Also, try restarting ambari-server, just in case. Regarding the Hive error: what's your Hive transport mode, binary or http? Only the Hive view packaged with Ambari-2.1.2.1 (and, I guess, 2.2) supports http mode; older Ambari versions support only binary mode.

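For the ambari-server restart, the standard service command is enough (run it on the Ambari host; views are reloaded when the server comes back up):

```bash
# restart the Ambari server process
sudo ambari-server restart
```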
						
					