- Posts: 128
- Kudos Received: 15
- Solutions: 8

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 3458 | 01-13-2015 09:09 AM |
| | 5491 | 05-28-2014 09:28 AM |
| | 2457 | 04-22-2014 01:24 PM |
| | 2282 | 03-31-2014 09:07 AM |
| | 69351 | 02-07-2014 08:40 AM |
**05-28-2014 09:28 AM** · 1 Kudo

Make sure you have the hostnames (/etc/hosts) set up properly on all the hosts. Try to ping each host from the other hosts and see that you get a reply. "Resource busy" means the process is still running; kill the process and start over, making sure you have the correct hostnames or IP addresses.
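A minimal sketch of that resolution check, assuming a POSIX shell; the node list in `HOSTS` is a placeholder, not from the original post — substitute your actual cluster hostnames:

```shell
#!/bin/sh
# Sketch: confirm every cluster hostname resolves before restarting services.
# HOSTS is a placeholder -- replace with your node names, e.g. "master1 worker1 worker2".
HOSTS="localhost"

check_host() {
    # getent consults /etc/hosts as well as DNS, matching what Hadoop sees.
    if getent hosts "$1" > /dev/null 2>&1; then
        echo "$1 resolves"
    else
        echo "$1 DOES NOT resolve"
    fi
}

for h in $HOSTS; do
    check_host "$h"
done
```

Any host reported as not resolving needs an /etc/hosts entry (or DNS record) fixed before the services are restarted.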
**05-28-2014 09:25 AM**

Can you share the error log?
**04-22-2014 01:24 PM** · 1 Kudo

I don't think Cloudera Manager or the Hadoop framework handles unicast. Every server in the Hadoop ecosystem is connected and communicating all the time.
**03-31-2014 09:07 AM**

Based on the operating system you use, navigate to the repo file for your system and save it in the /etc/yum.repos.d/ directory.

For Red Hat/CentOS/Oracle 5, add the lines below in /etc/yum.repos.d/cloudera-cdh5.repo:

```
[cloudera-cdh5]
# Packages for Cloudera's Distribution for Hadoop, Version 5, on RedHat or CentOS 5 x86_64
name=Cloudera's Distribution for Hadoop, Version 5
baseurl=http://archive.cloudera.com/cdh5/redhat/5/x86_64/cdh/5/
gpgkey=http://archive.cloudera.com/cdh5/redhat/5/x86_64/cdh/RPM-GPG-KEY-cloudera
gpgcheck=1
```

For Red Hat/CentOS/Oracle 6 (64-bit):

```
[cloudera-cdh5]
# Packages for Cloudera's Distribution for Hadoop, Version 5, on RedHat or CentOS 6 x86_64
name=Cloudera's Distribution for Hadoop, Version 5
baseurl=http://archive.cloudera.com/cdh5/redhat/6/x86_64/cdh/5/
gpgkey=http://archive.cloudera.com/cdh5/redhat/6/x86_64/cdh/RPM-GPG-KEY-cloudera
gpgcheck=1
```

After doing this, you can see the packages listed with `yum list cloudera*`.

Hope this helps.
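Since the two stanzas differ only in the OS major version, the repo file can be generated with a small helper — a sketch assuming the archive.cloudera.com layout shown above:

```shell
#!/bin/sh
# Sketch: emit the cloudera-cdh5 repo stanza for a given RHEL/CentOS major
# version, following the archive.cloudera.com URL layout shown above.
make_cdh5_repo() {
    ver="$1"
    cat <<EOF
[cloudera-cdh5]
name=Cloudera's Distribution for Hadoop, Version 5
baseurl=http://archive.cloudera.com/cdh5/redhat/$ver/x86_64/cdh/5/
gpgkey=http://archive.cloudera.com/cdh5/redhat/$ver/x86_64/cdh/RPM-GPG-KEY-cloudera
gpgcheck=1
EOF
}

# Print the RHEL/CentOS 6 stanza; in practice this would be redirected
# into /etc/yum.repos.d/cloudera-cdh5.repo (root required).
make_cdh5_repo 6
```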
**02-18-2014 08:13 AM**

Awesome, the conf change resolved the issue. Thanks a lot!
**02-18-2014 07:15 AM**

- CDH version: cdh4.5.0
- Yes, we are using CM. CM version: Cloudera Enterprise 4.8.0
**02-18-2014 07:07 AM**

My workflow.xml:

```xml
<workflow-app xmlns="uri:oozie:workflow:0.4" name="hive-wf">
  <credentials>
    <credential name='hive_credentials' type='hcat'>
      <property>
        <name>hcat.metastore.uri</name>
        <value>thrift://hostname.com:9083</value>
      </property>
      <property>
        <name>hcat.metastore.principal</name>
        <value>hive/_HOST@PRINCIPAL.COM</value>
      </property>
    </credential>
  </credentials>
  <start to="hive-node"/>
  <action name="hive-node" cred="hive_credentials">
    <hive xmlns="uri:oozie:hive-action:0.2">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <job-xml>/user/someuser/hive-site.xml</job-xml>
      <configuration>
        <property>
          <name>oozie.hive.defaults</name>
          <value>/user/someuser/hive-default.xml</value>
        </property>
      </configuration>
      <script>script.q</script>
    </hive>
    <ok to="end"/>
    <error to="fail"/>
  </action>
  <kill name="fail">
    <message>
      ${wf:name()} Work Flow Process failed miserably.
      Work Flow Id: ${wf:id()}
      Work Flow Name: ${wf:name()}
      Error Node: ${wf:lastErrorNode()}
      Error Message: ${wf:errorMessage(wf:lastErrorNode())}
    </message>
  </kill>
  <end name="end"/>
</workflow-app>
```
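The `${jobTracker}` and `${nameNode}` parameters in the workflow above are normally supplied via a job.properties file at submission time. A sketch under assumed values — the host names and HDFS path below are placeholders, not from the original post:

```
# Hypothetical job.properties for submitting the workflow above;
# host names and paths are placeholders.
nameNode=hdfs://namenode-host:8020
jobTracker=jobtracker-host:8021
oozie.wf.application.path=${nameNode}/user/someuser/hive-wf
oozie.use.system.libpath=true
```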
**02-14-2014 02:59 PM**

I got the below error while running the Oozie workflow:

```
ID : 0000004-140214173424343-oozie-oozi-W@hive-node
------------------------------------------------------------------------------------------------------------------------------------
Console URL       : -
Error Code        : JA020
Error Message     : JA020: Could not load credentials of type [hcat] with name [hive_credentials]]; perhaps it was not defined in oozie-site.xml?
External ID       : -
External Status   : ERROR
Name              : hive-node
Retries           : 0
Tracker URI       : -
Type              : hive
Started           : 2014-02-14 22:49 GMT
Status            : ERROR
Ended             : 2014-02-14 22:49 GMT
------------------------------------------------------------------------------------------------------------------------------------
```

Does anyone have an idea about the issue? Am I missing something? Thanks.

Labels: Apache Oozie
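JA020 typically means the `hcat` credential type has not been registered with the Oozie server. A sketch of the relevant oozie-site.xml property, assuming a standard Oozie install with the built-in HCatalog credentials class — verify the class name against your Oozie version's documentation before applying:

```xml
<!-- Registers the "hcat" credential type referenced by cred="hive_credentials". -->
<property>
  <name>oozie.credentials.credentialclasses</name>
  <value>hcat=org.apache.oozie.action.hadoop.HCatCredentials</value>
</property>
```

The Oozie server must be restarted after changing this property.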
**02-07-2014 08:40 AM** · 2 Kudos

This solved my problem. I was using the wrong syntax; below is the right one:

```
!connect jdbc:hive2://hostname:10000/default;principal=hive/hostname@PRINCIPAL.COM
```

Username and password are left blank.

Thanks for helping me out, zhang...
**02-07-2014 06:48 AM**

No luck.

```
0: jdbc:hive2://hostname.c> show databases;
No current connection
0: jdbc:hive2://hostname.c>
```