Member since 12-09-2015

115 Posts
43 Kudos Received
12 Solutions

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 9539 | 07-10-2017 09:38 PM |
| | 6766 | 04-10-2017 03:24 PM |
| | 1864 | 03-04-2017 04:08 PM |
| | 6121 | 02-17-2017 10:42 PM |
| | 7447 | 02-17-2017 10:41 PM |
			
    
	
		
		
05-29-2020 10:11 AM

Hi, has anyone gotten this running? Can someone post a working example?

Thanks,
Marcel
    
	
		
		
01-02-2018 11:42 AM (1 Kudo)

@Raja Sekhar Chintalapati, the command below should do it:

hdfs dfs -ls -R / | awk '{ if ($3 == "spark" && substr($0,1,1) != "d") { print $8 } }' | xargs hdfs dfs -rm

In the command above, "spark" is the user name; replace it with your own. I also used '/' as the path; if you want to delete files only under a certain directory, replace it with that directory. This removes only the files owned by the user, not the directories.

Thanks,
Aditya
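The owner-and-file filter in that pipeline can be exercised offline against sample `hdfs dfs -ls -R` output before pointing it at a real cluster. The listing below is made up for illustration (awk string positions are 1-indexed, hence `substr($0,1,1)`):

```shell
#!/bin/sh
# Simulated `hdfs dfs -ls -R /` output: permissions, replication, owner,
# group, size, date, time, path. Lines starting with "d" are directories.
sample_listing() {
  printf '%s\n' \
    '-rw-r--r--   3 spark hdfs   1024 2018-01-02 11:00 /tmp/spark-job.log' \
    'drwxr-xr-x   - spark hdfs      0 2018-01-02 11:00 /tmp/spark-dir' \
    '-rw-r--r--   3 hive  hdfs   2048 2018-01-02 11:00 /tmp/hive.log'
}

# Keep paths whose owner ($3) is "spark" and whose line does not start
# with "d" -- i.e. files owned by the user, but not directories.
matches=$(sample_listing | awk '{ if ($3 == "spark" && substr($0,1,1) != "d") { print $8 } }')
echo "$matches"
```

Only once the printed paths look right would you append `| xargs hdfs dfs -rm` to actually delete.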
			
    
	
		
		
05-15-2018 08:21 AM

Hi, is there a workaround for the above, or is this the expected behaviour in 2.6.3 and later?
			
    
	
		
		
08-14-2018 06:28 AM

What changes did you have to make in the /etc/hosts file on the RegionServer/Master?
			
    
	
		
		
03-05-2017 02:21 PM

This should have been created for you automatically if you entered CHRSV@COM in the "Additional Realms" box on the Configure Identities page of the Enable Kerberos wizard.

Assuming you didn't do this, how was the krb5.conf file set up to acknowledge the trusted realm?
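For reference, a trusted additional realm usually shows up in krb5.conf along these lines; the realm names and KDC hosts below are placeholders, not values from this thread:

```
[realms]
  YOUR_REALM.COM = {
    kdc = kdc1.your.domain
    admin_server = kdc1.your.domain
  }
  TRUSTED.REALM = {
    kdc = kdc.trusted.domain
    admin_server = kdc.trusted.domain
  }

[domain_realm]
  .trusted.domain = TRUSTED.REALM
```

The [domain_realm] mapping is what lets clients route hosts in the trusted domain to the trusted realm.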
			
    
	
		
		
02-21-2017 03:03 AM

@Raja Sekhar Chintalapati Of course, if you set up a KDC the keytabs are valid, but you need to grab a valid ticket to proceed.

List all keytabs:

$ ls /etc/security/keytabs

List the valid principals for a keytab:

$ klist -kt /etc/security/keytabs/hive.service.keytab
Keytab name: FILE:/etc/security/keytabs/hive.service.keytab
KVNO Timestamp         Principal
---- ----------------- --------------------------------------------------------
   1 02/02/17 23:00:12 hive/Ambari-Host_name@YOUR_REALM.COM
   1 02/02/17 23:00:12 hive/Ambari-Host_name@YOUR_REALM.COM

Grab a valid ticket:

$ kinit -kt /etc/security/keytabs/hive.service.keytab hive/Ambari-Host_name@YOUR_REALM.COM

Check its validity:

$ klist
Ticket cache: FILE:/tmp/krb5cc_504
Default principal: hive/Ambari-Host_name@YOUR_REALM.COM
Valid starting     Expires            Service principal
02/10/17 01:32:45  02/11/17 01:32:45  krbtgt/YOUR_REALM.COM@YOUR_REALM.COM
        renew until 02/10/17 01:32:45

With a valid ticket, this is the correct connect string (quoted so the shell does not split it at the semicolon):

beeline -u "jdbc:hive2://hiveServer2_hostname:10000/;principal=hive/hiveServer2_hostname@YOUR_REALM.COM"

With the above you should be able to log on and execute your HQL.
			
    
	
		
		
02-08-2017 08:51 PM

Soon, with the application-lifetime feature (https://issues.apache.org/jira/browse/YARN-3813), you won't have to write such scripts: you will simply set the lifetime of the app to 20 minutes at creation time, and it will be killed at the 20-minute mark.

Until then, a script like the one below may help (assuming you have access to the yarn command line). It can easily be enhanced to run as a cron job, say triggering every minute, looking for specific apps that have exceeded a certain lifetime, and killing them.

Hope this helps:

#!/bin/bash
if [ "$#" -lt 2 ]; then
  echo "Usage: $0 <app_id> <max_life_in_mins>"
  exit 1
fi
# Finish-Time is 0 while the app is still running.
finish_time=$(yarn application -status "$1" 2>/dev/null | grep "Finish-Time" | awk '{print $NF}')
if [ "$finish_time" -ne 0 ]; then
  echo "App $1 is not running"
  exit 1
fi
# Start-Time is reported in milliseconds; bc evaluates start/1000 before
# the subtraction, so this yields "now_in_secs - start_in_secs".
time_diff=$(date +%s)-$(yarn application -status "$1" 2>/dev/null | grep "Start-Time" | awk '{print $NF}' | sed 's!$!/1000!')
time_diff_in_mins=$(echo "($time_diff)/60" | bc)
echo "App $1 is running for $time_diff_in_mins min(s)"
if [ "$time_diff_in_mins" -gt "$2" ]; then
  echo "Killing app $1"
  yarn application -kill "$1"
else
  echo "App $1 should continue to run"
fi
			
    
	
		
		
01-29-2018 09:02 PM

Remember: if you have Kafka, you need to change Configs -> Kafka Broker -> listeners back to PLAINTEXT://localhost:6667 (from PLAINTEXTSASL://localhost:6667).
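In the broker's server.properties this corresponds to the listeners property; the host and port below are the Ambari defaults mentioned above, so adjust them for your cluster:

```
# After disabling Kerberos: plain listener again
listeners=PLAINTEXT://localhost:6667

# While Kerberos was enabled it would have been:
# listeners=PLAINTEXTSASL://localhost:6667
```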
			
    
	
		
		
03-10-2017 04:55 PM

Hello,

I'm having the same issue. 😞 In the Hive metastore log I find:

2017-03-10 16:50:52,164 INFO  [main]: zookeeper.ZooKeeper (Environment.java:logEnv(100)) - Client environment:user.name=hive

No idea where this is coming from, though. All tips appreciated!

Peter
			
    
	
		
		
02-17-2017 10:42 PM

This is because MySQL is external to Ambari: when Kerberos is enabled, Ambari is not aware of MySQL and did not create keytabs for it, which is why Hive was not able to start.

I still need to find a way to create keytabs for non-Ambari components. For now I have moved these components to another server where all the services were deployed through Ambari.

Thanks to all for your help so far.
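For the record, keytabs for components Ambari does not manage can be created by hand against an MIT KDC. The principal name, hostnames, and paths below are placeholders for illustration, not values from this thread:

```
# On the KDC (or via kadmin with admin credentials):
kadmin.local -q "addprinc -randkey mysql/db-host.example.com@YOUR_REALM.COM"
kadmin.local -q "xst -k /etc/security/keytabs/mysql.service.keytab mysql/db-host.example.com@YOUR_REALM.COM"

# Then on the service host, restrict access and verify:
chown mysql:hadoop /etc/security/keytabs/mysql.service.keytab
chmod 400 /etc/security/keytabs/mysql.service.keytab
klist -kt /etc/security/keytabs/mysql.service.keytab
```

Note that xst regenerates the key by default, so export the keytab once and copy it to the host rather than re-exporting per machine.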