Member since: 09-02-2016

- 523 Posts
- 89 Kudos Received
- 42 Solutions

My Accepted Solutions
| Title | Views | Posted | 
|---|---|---|
| | 2723 | 08-28-2018 02:00 AM |
| | 2695 | 07-31-2018 06:55 AM |
| | 5676 | 07-26-2018 03:02 AM |
| | 2977 | 07-19-2018 02:30 AM |
| | 6459 | 05-21-2018 03:42 AM |
			
    
	
		
		
01-05-2017 05:22 PM

@benassi As we know, "Yarn Aggregate Log Retention" controls only YARN, but /tmp/logs is not limited to YARN. So can you check the YARN log dates using the steps below?

CM -> Yarn -> Web UI -> Resource Manager Web UI (it opens the 8088 link) -> click the Finished link (left side) -> scroll down and click the 'Last' button -> check the log dates. You should see only one day of history, since you configured the retention to 1 day (see also the command-line sketch below).

Note: Make sure CM -> Yarn -> Configuration -> Enable Log Aggregation = Enabled.

Thanks,
Kumar
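A minimal command-line sketch of the same check, assuming the default aggregation directory /tmp/logs and using "hadoop" only as a placeholder for the submitting user:

```sh
# List aggregated application logs in HDFS and check their modification dates.
# /tmp/logs is the default yarn.nodemanager.remote-app-log-dir; the per-user
# subdirectory depends on who submitted the jobs.
hdfs dfs -ls /tmp/logs
hdfs dfs -ls /tmp/logs/hadoop/logs

# With a 1-day retention, anything older than a day under these paths is
# probably not coming from YARN log aggregation.
```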
						
					
01-04-2017 11:52 AM

@bgooley You are correct: I get the privilege error when I use kadmin, but it works fine with kadmin.local. I understand that Generate Missing Credentials uses kadmin rather than kadmin.local, so that is what is causing the trouble.

[root@abc]# kadmin
Authenticating as principal root/admin@REALM.COM with password.
Password for root/admin@REALM.COM:
kadmin:  addprinc -maxrenewlife "432000 sec" -randkey -pw hadoop1 solr/<<my_IP>>@REALM.COM
WARNING: no policy specified for solr/<<my_IP>>@REALM.COM; defaulting to no policy
add_principal: Operation requires ``add'' privilege while creating "solr/<<my_IP>>@REALM.COM".c
[root@abc]# kadmin.local
kadmin.local:  addprinc -maxrenewlife "432000 sec" -randkey -pw hadoop1 solr/<<my_IP>>@REALM.COM
WARNING: no policy specified for solr/<<my_IP>>@REALM.COM; defaulting to no policy
Principal "solr/<<my_IP>>@REALM.COM" created.

I tried to import the credentials via CM -> Administration -> Security. It reports "Successfully imported KDC Account Manager credentials", but when I list the Kerberos credentials, the principal is still missing only for solr. So I deleted the principal that I had added manually with kadmin.local. How can I fix the issue with kadmin, so that I can use the Generate Missing Credentials option?

Here is my configuration; do you think any change is required?

cat /var/kerberos/krb5kdc/kadm5.acl
*/admin@REALM.COM *
hive@REALM.COM *
hdfs@REALM.COM
  ###
kadmin.local:  listprincs
HTTP/<<my_ipaddress>>@REALM.COM
K/M@REALM.COM
cloudera-scm/admin@REALM.COM
hdfs/<<my_ipaddress>>@REALM.COM
hdfs@REALM.COM
hive/<<my_ipaddress>>@REALM.COM
hue/<<my_ipaddress>>@REALM.COM
impala/<<my_ipaddress>>@REALM.COM
kadmin/admin@REALM.COM
kadmin/changepw@REALM.COM
kadmin/<<my_ipaddress>>@REALM.COM
krbtgt/REALM.COM@REALM.COM
mapred/<<my_ipaddress>>@REALM.COM
oozie/<<my_ipaddress>>@REALM.COM
root/admin@REALM.COM
root@REALM.COM
sentry/<<my_ipaddress>>@REALM.COM
yarn/<<my_ipaddress>>@REALM.COM
zookeeper/<<my_ipaddress>>@REALM.COM

I've confirmed that my fully qualified domain name (FQDN) is correct in my configuration.

Note: I am using the admin login in Cloudera Manager to generate the new principal, and root/admin@REALM in the CLI to add the new principal.
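A hedged sketch of verifying the ACL from the kadmin (network) side; it assumes an MIT KDC with a RHEL/CentOS-style init script, and test/acl-check is just a throwaway principal name:

```sh
# kadmind reads /var/kerberos/krb5kdc/kadm5.acl only at startup, so any ACL
# change requires restarting the kadmin service on the KDC host.
service kadmin restart

# Exercise the privileges Cloudera Manager needs through the network path
# (kadmin, not kadmin.local): add, inquire, and delete a throwaway principal.
kadmin -p root/admin@REALM.COM -q "addprinc -randkey test/acl-check@REALM.COM"
kadmin -p root/admin@REALM.COM -q "getprinc test/acl-check@REALM.COM"
kadmin -p root/admin@REALM.COM -q "delprinc -force test/acl-check@REALM.COM"
```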
						
					
01-04-2017 09:32 AM

@bgooley Thanks for the quick reply. Let me double-check all the points that you have mentioned.

In the meantime, I am still not clear on one point. I believe my /var/kerberos/krb5kdc/kadm5.acl and other configurations are fine because, as I mentioned already, all the existing services (HDFS, Hive, Impala, Oozie, Hue, etc.) are working fine. If there were a problem with my configuration, I should get the same error for the existing services too, so why do I get the error only for the new service?

The only differences between the existing and new services are:
1. The existing services were added before enabling Kerberos (everything is OK).
2. I am trying to add the new service now, after enabling Kerberos.

Any idea?
						
					
01-04-2017 08:52 AM

Hi,

CDH 5.7.x

I used to add new services to our cluster using Cloudera Manager without any issue before enabling Kerberos. We have now installed/enabled Kerberos, and everything is fine for the existing services. But when I try to add a new service (Solr), I get the following error:

Start Solr: Failed to start service
Execute command Start this Solr Server on role Solr Server: Command failed to run because this role has invalid configuration. Review and correct its configuration. First error: Role is missing Kerberos keytab. Please run the Generate Missing Credentials command on the Kerberos Credentials tab of the Administration -> Security page.

I have tried to run Generate Missing Credentials on the Administration -> Security page, but it ends with the following error:

/usr/share/cmf/bin/gen_credentials.sh failed with exit code 1 and output of <<
+ export PATH=/usr/kerberos/bin:/usr/kerberos/sbin:/usr/lib/mit/sbin:/usr/sbin:/usr/lib/mit/bin:/usr/bin:/sbin:/usr/sbin:/bin:/usr/bin
+ PATH=/usr/kerberos/bin:/usr/kerberos/sbin:/usr/lib/mit/sbin:/usr/sbin:/usr/lib/mit/bin:/usr/bin:/sbin:/usr/sbin:/bin:/usr/bin
+ CMF_REALM=REALM.COM
+ KEYTAB_OUT=/var/run/cloudera-scm-server/cmf6942980384105255302.keytab
+ PRINC=solr/<<my_ipaddress>>@REALM.COM
+ MAX_RENEW_LIFE=432000
+ KADMIN='kadmin -k -t /var/run/cloudera-scm-server/cmf2028852611455413307.keytab -p root/admin@REALM.COM -r REALM.COM'
+ RENEW_ARG=
+ '[' 432000 -gt 0 ']'
+ RENEW_ARG='-maxrenewlife "432000 sec"'
+ '[' -z /var/run/cloudera-scm-server/krb5920427054266466413.conf ']'
+ echo 'Using custom config path '\''/var/run/cloudera-scm-server/krb5920427054266466413.conf'\'', contents below:'
+ cat /var/run/cloudera-scm-server/krb5920427054266466413.conf
+ kadmin -k -t /var/run/cloudera-scm-server/cmf2028852611455413307.keytab -p root/admin@REALM.COM -r REALM.COM -q 'addprinc -maxrenewlife "432000 sec" -randkey solr/<<my_ipaddress>>@REALM.COM'
WARNING: no policy specified for solr/<<my_ipaddress>>@REALM.COM; defaulting to no policy
add_principal: Operation requires ``add'' privilege while creating "solr/<<my_ipaddress>>@REALM.COM".
+ '[' 432000 -gt 0 ']'
++ kadmin -k -t /var/run/cloudera-scm-server/cmf2028852611455413307.keytab -p root/admin@REALM.COM -r REALM.COM -q 'getprinc -terse solr/<<my_ipaddress>>@REALM.COM'
++ tail -1
++ cut -f 12
get_principal: Operation requires ``get'' privilege while retrieving "solr/<<my_ipaddress>>@REALM.COM".
+ RENEW_LIFETIME='Authenticating as principal root/admin@REALM.COM with keytab /var/run/cloudera-scm-server/cmf2028852611455413307.keytab.'
+ '[' Authenticating as principal root/admin@REALM.COM with keytab /var/run/cloudera-scm-server/cmf2028852611455413307.keytab. -eq 0 ']'
/usr/share/cmf/bin/gen_credentials.sh: line 35: [: too many arguments
+ kadmin -k -t /var/run/cloudera-scm-server/cmf2028852611455413307.keytab -p root/admin@REALM.COM -r REALM.COM -q 'xst -k /var/run/cloudera-scm-server/cmf6942980384105255302.keytab solr/<<my_ipaddress>>@REALM.COM'
kadmin: Operation requires ``change-password'' privilege while changing solr/<<my_ipaddress>>@REALM.COM's key
+ chmod 600 /var/run/cloudera-scm-server/cmf6942980384105255302.keytab
chmod: cannot access `/var/run/cloudera-scm-server/cmf6942980384105255302.keytab': No such file or directory
>>

So I manually added "solr/<<my_ipaddress>>@REALM.COM" using kadmin.local and tried to import it from the Administration -> Security page, but no luck.

So now my questions are:
1. Is there any prerequisite for adding a new service to a Kerberized cluster?
2. I cannot simply press "Generate Missing Credentials" on the Administration -> Security page, because how would my cluster know which service I am going to add? It could be Solr or something else. Still, I tried it, and it says there is nothing to generate.

Thanks,
Kumar
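A hedged way to reproduce the failing steps from gen_credentials.sh by hand with the same admin principal, to confirm whether the KDC ACL (rather than Cloudera Manager) is rejecting the requests; the keytab path below is only an illustrative temporary location:

```sh
# Each operation maps to a kadm5.acl privilege, matching the errors above:
# addprinc -> add, getprinc -> get/inquire, xst -> change-password
# (xst re-keys the principal while extracting the keytab).
kadmin -p root/admin@REALM.COM -q 'addprinc -maxrenewlife "432000 sec" -randkey solr/<<my_ipaddress>>@REALM.COM'
kadmin -p root/admin@REALM.COM -q 'getprinc -terse solr/<<my_ipaddress>>@REALM.COM'
kadmin -p root/admin@REALM.COM -q 'xst -k /tmp/solr-test.keytab solr/<<my_ipaddress>>@REALM.COM'
```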
						
					
Labels: Kerberos

12-31-2016 04:56 PM

@cdhnaidu I hope you have already enabled Sentry.

I can see that you log in to Hue as the hive user. Can you log in to Hue as admin instead of hive and try again?

Note: make sure you have the required users, groups, and roles in both Hadoop and Hue. Also check your admin configuration settings in CM -> Sentry -> Configuration.

A few sample commands (see the sketch below):
- Create the role: CREATE ROLE admin;
- Grant privileges to the admin role: GRANT ALL ON SERVER server1 TO ROLE admin WITH GRANT OPTION;
- Assign the role to a group: GRANT ROLE admin TO GROUP administrators;

After these steps, all users within the administrators group are allowed to manage Hive privileges.
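A minimal sketch of running those statements through beeline as a Sentry admin user; the HiveServer2 host, Kerberos principal, and group name are placeholders:

```sh
# Connect to HiveServer2 and apply the Sentry grants, then verify them.
beeline -u "jdbc:hive2://hs2-host:10000/default;principal=hive/hs2-host@REALM.COM" \
  -e "CREATE ROLE admin;" \
  -e "GRANT ALL ON SERVER server1 TO ROLE admin WITH GRANT OPTION;" \
  -e "GRANT ROLE admin TO GROUP administrators;" \
  -e "SHOW ROLE GRANT GROUP administrators;"
```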
						
					
12-23-2016 12:30 PM (1 Kudo)

@zhuw.bigdata I hope you have already imported the KDC Account Manager credentials using the following steps: CM -> Administration -> Settings -> Import KDC Account Manager Credentials. And now you want to change the credential.

In your CLI, type kadmin.local (if you are on the Kerberos master node) or kadmin (if you are on a client/remote node), then:

kadmin.local: ?    # Typing ? prints the help, including how to change credentials.

Hope this helps.
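As a concrete hedged example (the principal name is only illustrative; cpw is the kadmin shorthand for change_password):

```sh
# On the KDC host, change the password of the account-manager principal.
kadmin.local -q "cpw cloudera-scm/admin@REALM.COM"

# Afterwards, re-import the updated credential in Cloudera Manager:
# CM -> Administration -> Settings -> Import KDC Account Manager Credentials.
```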
						
					
12-20-2016 01:27 PM

From your examples, I gather you are getting this issue for almost all the tables.

1. Can you try to access the table from Impala?
2. Please do not forget to execute "INVALIDATE METADATA <table>" first (see the sketch below).
3. Are you getting this issue only for existing tables? Can you create a new table in Hive and test it? If the new table works, compare the configuration differences between the old and new tables.
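A quick sketch of step 2 from impala-shell; the daemon host, database, and table names are placeholders:

```sh
# Refresh Impala's view of the Hive metastore for the table in question...
impala-shell -i impalad-host -q "INVALIDATE METADATA default.my_table"
# ...then retry the query that was failing.
impala-shell -i impalad-host -q "SELECT COUNT(*) FROM default.my_table"
```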
						
					
12-20-2016 12:21 PM

If you are not already aware, do not get confused by /apps/hive/warehouse/. In Cloudera, Hive metastore databases are usually created under /user/hive/warehouse, whereas in the Hortonworks distribution they are usually under /apps/hive/warehouse/. The bottom line of that link is to make sure the Hive warehouse directory matches in every place it is configured.
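A hedged way to cross-check the configured warehouse directory against what actually exists on HDFS (the JDBC URL is a placeholder):

```sh
# Ask HiveServer2 which warehouse directory it is configured with...
beeline -u "jdbc:hive2://hs2-host:10000/default" -e "SET hive.metastore.warehouse.dir;"
# ...and compare with the directories that actually exist in HDFS.
hdfs dfs -ls /user/hive/warehouse
hdfs dfs -ls /apps/hive/warehouse
```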
						
					
12-20-2016 12:08 PM

Try this: https://community.hortonworks.com/content/supportkb/48759/javalangillegalargumentexception-wrong-fs-running.html
						
					