Member since: 01-30-2017

36 Posts | 1 Kudos Received | 0 Solutions
04-14-2017 11:14 AM

@mathieu.d @mbigelow Thank you both; I was able to achieve the desired result. I first stored the logs locally and then uploaded them to HDFS.
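For anyone finding this later, a minimal sketch of that pattern; the /tmp/$USER/logs and /user/$USER/logs paths are illustrative, not necessarily the exact ones used:

    # Write the log on the local filesystem of the node that runs the action
    TIMESTAMP=$(date "+%Y-%m-%d")
    mkdir -p /tmp/$USER/logs
    touch /tmp/$USER/logs/${TIMESTAMP}.success_log

    # Then upload it to HDFS so it is visible no matter which node ran the script
    hdfs dfs -put -f /tmp/$USER/logs/${TIMESTAMP}.success_log /user/$USER/logs/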
						
					
    
	
		
		
04-14-2017 07:36 AM

@mathieu.d Here I am creating files in the /tmp folder on the datanodes, right? Can I do

    hdfs dfs -put /tmp/$USER/...success.log /user/$USER/logs/...success.log

Will this work?
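It can, with one caveat worth checking first, sketched below: the /user/$USER/logs directory has to exist in HDFS before the put. These commands are illustrative; the elided file names above are left untouched:

    # The target directory must exist in HDFS before the -put
    hdfs dfs -mkdir -p /user/$USER/logs

    # After the put, confirm the file landed where expected
    hdfs dfs -ls /user/$USER/logs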
						
					
    
	
		
		
04-14-2017 04:36 AM

I mean, how can we check the folders on the datanodes? I don't know how to do that.
						
					
    
	
		
		
04-14-2017 03:04 AM

How can I do that?
						
					
    
	
		
		
04-13-2017 12:16 PM

@mbigelow I have tried the following:

    mkdir -p /tmp/$USER/logs
    touch /tmp/$USER/logs/${TIMESTAMP}.success_log

But in the /tmp folder I don't see any folder called logs and cannot find the file. However, when I go to the /tmp folder on the edge node, I can create files and directories. Please advise where the problem is occurring.
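One way to narrow this down, sketched below: an Oozie shell action runs on whichever cluster node YARN assigns it to, not on the edge node, so the files most likely do exist, but in /tmp on that node rather than on the machine being checked. Printing the hostname to stdout (which shows up in the action's launcher logs in the Oozie/YARN consoles) makes that visible; the echo line is purely a debugging addition:

    # Print the host so it shows up in the action's stdout (visible in the Oozie/YARN job logs)
    echo "shell action ran on: $(hostname)"
    mkdir -p /tmp/$USER/logs
    touch /tmp/$USER/logs/${TIMESTAMP}.success_log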
						
					
    
	
		
		
04-12-2017 09:27 PM

@mbigelow So if I save the logs to /tmp/sanje, I will be able to collect them irrespective of the node the script runs on. Is this correct?

What about these:

    LOG_LOCATION=/home/$USER/logs
    exec 2>&1

Should these also point to the /tmp folder, and then be moved to HDFS?
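A small sketch of what that change could look like, reusing the variable names from the script further down this page. The redirection shown here is an interpretation of the intent (capturing both stdout and stderr into a local file), since `exec 2>&1` on its own only merges stderr into stdout:

    # Keep the logs on local /tmp of whichever node runs the action
    TIMESTAMP=$(date "+%Y-%m-%d")
    LOG_LOCATION=/tmp/$USER/logs
    mkdir -p "$LOG_LOCATION"

    # Capture both stdout and stderr of everything that follows into a local file
    exec > "$LOG_LOCATION/${TIMESTAMP}.stdout_log" 2>&1

    # ...rest of the script...
    # then copy the files under $LOG_LOCATION to HDFS at the end, as discussed above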
						
					
    
	
		
		
04-12-2017 08:52 PM

@mbigelow As you can see in my script, I am using the touch command to create files in Linux, but when I schedule the script in Oozie, it throws an error that touch cannot create the file or directory. I don't know why this is happening.
						
					
    
	
		
		
04-08-2017 06:24 PM

I have a shell script in HDFS that I want to schedule in Oozie.

    #!/bin/bash
    LOG_LOCATION=/home/$USER/logs
    exec 2>&1
    [ $# -ne 1 ] && { echo "Usage : $0 table "; exit 1; }

    table=$1

    TIMESTAMP=`date "+%Y-%m-%d"`
    touch /home/$USER/logs/${TIMESTAMP}.success_log
    touch /home/$USER/logs/${TIMESTAMP}.fail_log
    success_logs=/home/$USER/logs/${TIMESTAMP}.success_log
    failed_logs=/home/$USER/logs/${TIMESTAMP}.fail_log

    # Function to get the status of the job creation
    function log_status
    {
        status=$1
        message=$2
        if [ "$status" -ne 0 ]; then
            echo "`date +\"%Y-%m-%d %H:%M:%S\"` [ERROR] $message [Status] $status : failed" | tee -a "${failed_logs}"
            #echo "Please find the attached log file for more details"
            exit 1
        else
            echo "`date +\"%Y-%m-%d %H:%M:%S\"` [INFO] $message [Status] $status : success" | tee -a "${success_logs}"
        fi
    }

    `hive -e "create table testing.${table} as select * from database.${table}"`

    g_STATUS=$?
    log_status $g_STATUS "Hive create ${table}"

I have some questions about using Oozie to schedule shell scripts.

1) In my script I have success and fail logs which tell me whether the script succeeded or failed. Can we have this kind of feature in HDFS as well while using Oozie?

2) In the script I am also collecting the stdout logs, as you can see in the 2nd and 3rd lines after the shebang. Can this also be achieved in HDFS?

If so, how can we achieve both of these in HDFS while scheduling shell scripts with Oozie? Could anyone explain, please? If there are better ways to do things in Oozie, please let me know.
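Going by the resolution posted above (store the logs locally, then upload them to HDFS), here is a hedged sketch of how the log-handling parts of this script could change to cover both questions. The /tmp and /user/$USER/logs paths, the upload_logs helper, and the trap are illustrative additions, not something prescribed by Oozie:

    # Keep the success/fail logs on the local disk of the node running the shell action
    LOG_LOCATION=/tmp/$USER/logs
    mkdir -p "$LOG_LOCATION"
    success_logs=$LOG_LOCATION/${TIMESTAMP}.success_log
    failed_logs=$LOG_LOCATION/${TIMESTAMP}.fail_log

    # Upload whatever was written to HDFS when the script exits,
    # even when log_status exits early with status 1
    upload_logs() {
        hdfs dfs -mkdir -p /user/$USER/logs
        for f in "$success_logs" "$failed_logs"; do
            if [ -s "$f" ]; then
                hdfs dfs -put -f "$f" /user/$USER/logs/
            fi
        done
    }
    trap upload_logs EXIT

The trap is just one way to make sure the fail log still reaches HDFS when log_status exits with 1; calling an upload step right before that exit would work just as well.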
						
					
Labels: Apache Oozie
    
	
		
		
03-22-2017 02:44 PM

@DWinters How were you able to overcome this issue? Could you please post a sample solution?
						
					
    
	
		
		
03-22-2017 11:00 AM

@maurin I am getting the below error:

    TypeError: 'StructField' object has no attribute '__getitem__'

at

    df_schema_dict_by_name = dict((d['name'], dict(d, index=index)) for (index, d) in enumerate(df_schema_dict))

How do I rectify this?
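A guess at a fix, sketched below: the message suggests df_schema_dict holds StructField objects (for example from df.schema.fields), which cannot be indexed like dicts; converting the schema to its JSON representation gives plain dicts with a 'name' key. The df variable name is assumed from context:

    # Assumes `df` is the PySpark DataFrame whose schema is being inspected.
    # StructType.jsonValue() returns {'type': 'struct', 'fields': [...]}, where each
    # field is a plain dict with 'name', 'type', 'nullable' and 'metadata' keys.
    df_schema_dict = df.schema.jsonValue()["fields"]

    df_schema_dict_by_name = dict(
        (d["name"], dict(d, index=index))
        for (index, d) in enumerate(df_schema_dict)
    )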
						
					