Member since 05-05-2016

65 Posts · 117 Kudos Received · 7 Solutions
        My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 7340 | 06-20-2016 09:08 AM |
|  | 1632 | 05-26-2016 01:25 PM |
|  | 10917 | 05-26-2016 01:14 PM |
|  | 1555 | 05-25-2016 09:14 AM |
|  | 2581 | 05-25-2016 09:03 AM |
			
    
	
		
		
05-25-2016 09:19 AM · 3 Kudos
@sanka sandeep Since HDP does not support Apache Drill at the moment, you will not be able to start/stop the Drill service using Ambari. Once you download the Drill packages from the Apache Drill website, you can use the bin/drillbit.sh start script to start Drill.

Drill install guide (distributed mode): https://drill.apache.org/docs/installing-drill-on-the-cluster/
Drill install guide (embedded mode): https://drill.apache.org/docs/starting-drill-on-linux-and-mac-os-x/
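As a sketch, managing a Drillbit manually on each node looks like this (the /opt/drill path is an assumption; use wherever you unpacked the Drill tarball):

```shell
# Assumed install location; adjust to your own unpack directory.
cd /opt/drill

# Start the Drillbit daemon on this node
bin/drillbit.sh start

# Check whether it is running
bin/drillbit.sh status

# Stop it again when needed
bin/drillbit.sh stop
```

In distributed mode you would repeat this on every node listed in Drill's ZooKeeper configuration.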
						
					
    
	
		
		
05-25-2016 09:14 AM · 2 Kudos
@c pat HDF 1.2 is compatible with HDP 2.3+. I have not seen a similar compatibility matrix for HDF 1.1. http://docs.hortonworks.com/HDPDocuments/HDF1/HDF-1.2/bk_HDF_InstallSetup/content/hdf_supported_hdp.html
						
					
    
	
		
		
05-25-2016 09:03 AM · 5 Kudos
@rahul jain From the URL it looks like you are searching for the Hive ODBC driver for SUSE. You can download the Hive ODBC driver from http://hortonworks.com/downloads/ — go to the Hortonworks ODBC Driver for Apache Hive (v2.1.2) section to choose your OS-specific download.

URL specific to SUSE: http://public-repo-1.hortonworks.com/HDP/hive-odbc/2.1.2.1002/suse11/hive-odbc-native-2.1.2.1002-1.x86_64.rpm

If you are looking for HDP packages in tarball format: http://docs.hortonworks.com/HDPDocuments/Ambari-2.2.2.0/bk_Installing_HDP_AMB/content/_hdp_24_repositories.html

Hope this helps.
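For example, downloading and installing that SUSE 11 RPM from the command line could look like this (the URL is the one from the post; sudo assumes a non-root shell):

```shell
# Fetch the SUSE 11 Hive ODBC driver RPM (URL from the downloads page above)
wget http://public-repo-1.hortonworks.com/HDP/hive-odbc/2.1.2.1002/suse11/hive-odbc-native-2.1.2.1002-1.x86_64.rpm

# Install it; zypper could be used instead to resolve dependencies automatically
sudo rpm -ivh hive-odbc-native-2.1.2.1002-1.x86_64.rpm
```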
						
					
    
	
		
		
05-23-2016 10:50 AM · 2 Kudos
@Greenhorn Techie As mentioned in the comment above, an edge node is beneficial when you access HDFS via the Hadoop API. With Knox, staging data on the edge node adds an extra hop, which increases the overall time taken to ingest data into HDFS.
						
					
    
	
		
		
05-23-2016 10:30 AM · 4 Kudos
@Greenhorn Techie If your source system has access to the Knox-exposed WebHDFS endpoint, that is the better approach, because you avoid the data hop on the edge node. This method should take less time to put data on HDFS than staging it on the edge node first. Accessing WebHDFS directly through Knox also means the source system does not need SSH access to the Knox edge node, so the first option is both more secure and faster.

Hope this helps.
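A minimal sketch of pushing a file through a Knox-fronted WebHDFS endpoint with curl; the gateway host, topology name ("default"), and credentials are all placeholders for your own cluster:

```shell
# Hypothetical Knox gateway URL and credentials; adjust for your cluster.
KNOX="https://knox.example.com:8443/gateway/default/webhdfs/v1"

# Step 1: request file creation; WebHDFS replies with a Location header
# (routed back through the gateway) telling you where to send the data.
curl -iku admin:admin-password -X PUT "$KNOX/tmp/sample.txt?op=CREATE"

# Step 2: PUT the file contents to the Location URL returned in step 1
# (shown here as a placeholder).
curl -iku admin:admin-password -X PUT -T sample.txt "<Location-URL-from-step-1>"
```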
						
					
    
	
		
		
05-20-2016 04:10 PM
@petri koski Can you share the Pig job logs?
						
					
    
	
		
		
05-20-2016 04:09 PM · 6 Kudos
@elan chelian Can you try your Sqoop command with the map memory increased from 256 MB to a higher value, like this:

sqoop import -D mapreduce.map.memory.mb=2048 -D mapreduce.map.java.opts=-Xmx1024m --connect jdbc:oracle:thin:@oracledbhost:1521:VAEDEV --table WC_LOY_MEM_TXN --username OLAP -P -m 1
						
					
    
	
		
		
05-20-2016 09:28 AM · 1 Kudo
@Roopa Raphael Supply "hadoop" as your current password; it will then prompt for a new password.
						
					
    
	
		
		
05-20-2016 09:06 AM · 1 Kudo
@Roopa Raphael The first login credentials for the Sandbox are:

username: root
password: hadoop

After the first login the Sandbox will ask you to change your password. Supply a new password when prompted, or use the passwd command to change the root password:

$ passwd

Additionally, the default Ambari credentials (admin/admin) might not work; you need to reset the ambari-server password. You can refer here: http://hortonworks.com/wp-content/uploads/2016/03/ReleaseNotes_3_1_2016.pdf

Hope this helps!
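Putting the steps above together in one SSH session might look like this; the ambari-admin-password-reset helper is the script shipped on HDP 2.4 sandbox images, and may not exist on older images (check the release notes linked above if it is missing):

```shell
# Log in as root (initial password "hadoop"), then change it interactively
passwd

# On HDP 2.4 sandbox images, this helper resets the Ambari admin password
# and restarts ambari-server; script name assumed from the sandbox docs.
ambari-admin-password-reset
```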
						
					
    
	
		
		
05-19-2016 12:31 PM · 2 Kudos
@rahul jain This might help you: https://bytealmanac.wordpress.com/2012/07/02/assigning-a-static-ip-to-a-vmware-workstation-vm/
						
					