Member since: 01-08-2018

133 Posts | 31 Kudos Received | 21 Solutions
        My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 18427 | 07-18-2018 01:29 AM |
|  | 3590 | 06-26-2018 06:21 AM |
|  | 6189 | 06-26-2018 04:33 AM |
|  | 3069 | 06-21-2018 07:48 AM |
|  | 2750 | 05-04-2018 04:04 AM |

07-26-2018 11:44 PM
Hello,

I changed the jar file name to mysql-connector-java.jar and then got an error saying the password does not meet the policy. I then removed my Red Hat machines and created CentOS ones instead; now I no longer get an error connecting to the MySQL database. I still don't know what the problem was.

Thanks,
Huriye

07-20-2018 06:20 AM
When enabling the YARN Cluster Utilization Report, please make sure that the User and Pool already exist, and that the User is a Linux user on all Hadoop nodes.
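As a minimal sketch (my own illustration, not part of the report setup), this is one way to check from Python whether a given account exists as a local Linux user on a node; 'yarn_user' is a placeholder for the user configured in the report, and you would run it on every Hadoop node:

```python
# Minimal check that an account exists as a local Linux user on this node.
# 'yarn_user' is a placeholder; replace it with the user configured for the report.
import pwd

def linux_user_exists(name):
    try:
        pwd.getpwnam(name)  # raises KeyError if the user is unknown on this node
        return True
    except KeyError:
        return False

print(linux_user_exists('yarn_user'))
```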

07-18-2018 02:40 PM (1 Kudo)
@yassine24,

This shows how to update a service configuration: http://cloudera.github.io/cm_api/docs/python-client/#configuring-services-and-roles

You need to update the config with the attribute name and its value. The request body is JSON, but the safety valve value you want to set is XML.

An example of updating a safety valve (HDFS in this case) via the REST API:

curl -iv -X PUT -H "Content-Type:application/json" -H "Accept:application/json" -d '{"items":[{ "name": "core_site_safety_valve","value": "<property><name>hadoop.proxyuser.ztsps.users</name><value>*</value></property><property><name>hadoop.proxyuser.ztsps.groups</name><value>*</value></property>"}]}' http://admin:admin@10.1.0.1:7180/api/v12/clusters/cluster/services/hdfs/config

I am fairly sure you can pass the same name/value pairs shown in the -d argument above to hbase.update_config() in the Python client.
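For reference, a rough sketch of the same change through the cm_api Python client might look like the following; the host, credentials, cluster name 'cluster', and the HDFS safety valve value are simply the placeholder values from the curl example, and for HBase you would fetch that service and set its own safety valve attribute instead:

```python
# Sketch of the same safety-valve update via the cm_api Python client.
# Host, credentials, cluster/service names and the XML value below are the
# placeholder values from the curl example; adjust them for your cluster.
from cm_api.api_client import ApiResource

api = ApiResource('10.1.0.1', 7180, 'admin', 'admin', version=12)
hdfs = api.get_cluster('cluster').get_service('hdfs')

safety_valve = (
    '<property><name>hadoop.proxyuser.ztsps.users</name><value>*</value></property>'
    '<property><name>hadoop.proxyuser.ztsps.groups</name><value>*</value></property>'
)

# update_config takes a dict of {config_name: value}, mirroring the JSON "items" list.
hdfs.update_config({'core_site_safety_valve': safety_valve})
```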

07-18-2018 01:29 AM (1 Kudo)
According to the error, it is looking for the Java 7 that was installed by Cloudera. You should define JAVA_HOME={path_to_your_jdk8_installation} in your .bashrc.

07-13-2018 02:05 AM
The developer (on the customer side) who works with me on the cluster tried Apache Airflow, and after one week he could do everything we need (workflows, emailing/alerting, re-runs, ...) without having to load files into HDFS. Apache Airflow runs in standalone mode and its web UI is better than the Oozie UI.

It seems a better solution than Oozie; what do you think about this?

As it is an incubating project, I don't know whether it's a good idea, but the web UI is good and it looks easy to manage. I didn't know this project before, but I think Oozie is outdated compared to Airflow.

For the moment Oozie is on stand-by; they will choose between Oozie and Airflow, but I must admit that Airflow looks like the better solution.

07-13-2018 12:20 AM
Regarding Python 2: if your Hive server is configured with SSL, you should consider installing the "sasl" package for Python.

As for Python 3, although this is a Python question rather than a Hive one, this kind of syntax error usually comes from the preceding lines, e.g. quotes or parentheses that are not terminated.
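As a hedged sketch of such a connection (the thread does not name a specific client library, so impyla is just one option, and the host, port, and credentials below are placeholders), something like this works against an SSL-enabled HiveServer2 once the sasl and thrift_sasl packages are installed:

```python
# Sketch: connect to an SSL-enabled HiveServer2 with impyla (one possible client).
# Requires the impyla, sasl and thrift_sasl packages; host/credentials are placeholders.
from impala.dbapi import connect

conn = connect(
    host='hiveserver2.example.com',  # placeholder hostname
    port=10000,                      # default HiveServer2 port
    use_ssl=True,
    auth_mechanism='PLAIN',          # SASL PLAIN auth; this is why the sasl package is needed
    user='hive_user',
    password='secret',
)
cur = conn.cursor()
cur.execute('SHOW DATABASES')
print(cur.fetchall())
```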

06-26-2018 12:00 PM
Here is the output:

sudo -u hdfs hdfs dfs -ls /
Found 3 items
drwxr-xr-x   - hbase  hbase               0 2018-06-26 11:19 /hbase
drwxrwxrwx   - hdfs   supergroup          0 2018-06-26 11:18 /tmp
drwxrwxr-x   - mapred mapred              0 2018-06-26 11:19 /user

I filed another problem, titled "Could not find yarn-site.xml, make sure to deploy yarn client in UI". I guess these problems may be related. Can you please take a look?

06-26-2018 09:46 AM
Thank you. I had put an alias in front of the hostname in the /etc/sysconfig/network file; removing the alias resolved the issue. The hostname -f command helped me identify it.

Regards,
Siv

06-26-2018 04:33 AM
OK, from the log it is obvious that the issue for Spark is the old JDK.

When you upgraded Java, did you define the Java home in "/etc/default/cloudera-scm-server", e.g.:

export JAVA_HOME="/usr/lib/jvm/java-8-oracle/"

Can you also send the relevant "/var/log/cloudera-scm-server/cloudera-scm-server.out"?

06-21-2018 07:48 AM
You should not worry about compatibility between KTS and CDH. If you check https://www.cloudera.com/documentation/enterprise/latest/topics/encryption_ref_arch.html#concept_npk_rxh_1v, you will see that CDH connects to KMS, and KMS connects to KTS. So you have to check whether the KMS that is compatible with KTS 3.8 is also compatible with CDH 5.14.2.