Member since: 07-08-2016

46 Posts
5 Kudos Received
2 Solutions

My Accepted Solutions

| Title | Views | Posted |
|---|---|---|
|  | 7102 | 07-21-2016 08:36 AM |
|  | 5096 | 07-12-2016 11:58 AM |

06-20-2018 02:39 PM

Hi. I have a problem with exporting a Hive table to an Oracle database. I want to encrypt and hide the password using a jceks credential store. I read a great article about using jceks when importing data with Sqoop: Storing Protected Passwords in Sqoop. It works great when I import data from Oracle to Hive, but when I try to export data from Hive to Oracle I get an error:

Unable to process alias

The Sqoop command I am trying to run:

sqoop export \
-Dhadoop.security.credential.provider.path=jceks://hdfs/user/hdfs/pass-enc.jceks \
--connect jdbc:oracle:thin:@1.1.1.1:2222:SID \
--table hive_temp_table_orc \
--username orc_user \
--password-alias oracle.password \
--hcatalog-database default \
--hcatalog-table hive_temp_table \
--hive-partition-key col1 \
--hive-partition-value 2011-01-01

My question is: is it possible to use jceks and the --password-alias parameter with the Sqoop export command, or is this only an option when importing data?

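For reference, the alias itself was created roughly like this, following that article (the alias name and provider path are the same ones used in the export command above):

hadoop credential create oracle.password \
  -provider jceks://hdfs/user/hdfs/pass-enc.jceks

The import job picks the password up from that same keystore without any problem; only the export fails.
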
Labels: Apache Sqoop

02-13-2018 08:37 AM

You should log in to your machine as the root user and install the nc package. I don't know which OS you are using, but if it is CentOS you should execute this command (on every host machine):

yum install nc

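Once it is installed, you can quickly check that a service port is reachable, for example (the hostname and port below are only placeholders):

nc -v namenode.example.com 8020

Some nc builds also accept -z to close the connection immediately after the check.
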
10-19-2017 08:00 AM

@mqureshi But the problem is that I don't have a subscription, so I don't have access to SmartSense. What should I do in that case?

10-18-2017 03:41 PM

Hi. I have a question regarding the hdp-conf-utils script and Ambari recommendations. I installed 8 NodeManagers on my cluster. Node hardware spec: cores - 4, RAM - 15 GB, disks - 4.

I ran the hdp-conf-utils.py script and got this:

Using cores=4 memory=15GB disks=4 hbase=False
Profile: cores=4 memory=14336MB reserved=1GB usableMem=14GB disks=4
Num Container=8
Container Ram=1792MB
Used Ram=14GB
Unused Ram=1GB
***** mapred-site.xml *****
mapreduce.map.memory.mb=1792
mapreduce.map.java.opts=-Xmx1280m
mapreduce.reduce.memory.mb=3584
mapreduce.reduce.java.opts=-Xmx2560m
mapreduce.task.io.sort.mb=640
***** yarn-site.xml *****
yarn.scheduler.minimum-allocation-mb=1792
yarn.scheduler.maximum-allocation-mb=14336
yarn.nodemanager.resource.memory-mb=14336
yarn.app.mapreduce.am.resource.mb=1792
yarn.app.mapreduce.am.command-opts=-Xmx1280m
***** tez-site.xml *****
tez.am.resource.memory.mb=3584
tez.am.java.opts=-Xmx2560m
***** hive-site.xml *****
hive.tez.container.size=1792
hive.tez.java.opts=-Xmx1280m
hive.auto.convert.join.noconditionaltask.size=402653000

I wanted to apply these recommendations to YARN, but Ambari recommends something different:

yarn.nodemanager.resource.memory-mb=5120
yarn.scheduler.minimum-allocation-mb=512
yarn.scheduler.maximum-allocation-mb=5120

Can anyone explain why Ambari and hdp-conf-utils recommend different values? I would be really grateful.

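For reference, the container arithmetic the script appears to use does reproduce the numbers above. This is only my rough reconstruction of it, and the 1024 MB minimum container size is an assumption on my part:

CORES=4; DISKS=4; MEM_MB=14336; MIN_MB=1024        # usable memory after the 1 GB reserve; MIN_MB is my assumption

A=$(( 2 * CORES ))                                 # 8
B=$(( (18 * DISKS + 9) / 10 ))                     # ceil(1.8 * disks) = 8
C=$(( MEM_MB / MIN_MB ))                           # 14
CONTAINERS=$(( A < B ? (A < C ? A : C) : (B < C ? B : C) ))   # min of the three = 8
RAM_MB=$(( MEM_MB / CONTAINERS ))                  # 1792, the "Container Ram" printed above
[ "$RAM_MB" -lt "$MIN_MB" ] && RAM_MB=$MIN_MB

echo "containers=$CONTAINERS, ram per container=${RAM_MB}MB"

So the script sizes containers against the full 14 GB usable on a node, while Ambari's suggestion of 5120 MB for yarn.nodemanager.resource.memory-mb only uses about a third of it, and that gap is exactly what I do not understand.
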
Labels: Apache YARN

10-17-2017 03:05 PM

I saw both of those sites. But my question is: how can I run the stack-advisor.py script? Can I simply run:

python stack-advisor.py

or do I need to configure something first? And where can I find a file with the recommendations?

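To be more concrete: is the expected invocation something like the one below? The action name and the two JSON input files are only my guess at what Ambari itself passes to the script; I have not found this documented.

cd /var/lib/ambari-server/resources/scripts
python ./stack_advisor.py recommend-configurations ./hosts.json ./services.json
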
10-17-2017 09:28 AM

Hi. I have a question regarding the Stack Advisor script. I want to tune my cluster - change the YARN, Tez and Hive configuration files. I can use the hdp-configuration-utils.py script, but it is too simple. I read that there are stack and service advisors which can give me recommendations. My question is: can I use the stack advisor script after HDP installation, or should I not use it because Ambari only uses it during HDP installation? And if I can use it, how can I do this?

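For example, I noticed the Ambari web UI seems to fetch its suggestions from a recommendations REST resource. Would calling it directly, roughly like this, be a supported way to do it? The payload fields and the stack version below are only my guess:

curl -u admin -H 'X-Requested-By: ambari' -X POST \
  -d '{"recommend": "configurations", "hosts": ["node1.example.com"], "services": ["YARN", "TEZ", "HIVE"]}' \
  http://ambari-host:8080/api/v1/stacks/HDP/versions/2.6/recommendations
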
Labels: Apache Ambari, Apache Tez, Apache YARN

09-28-2017 07:40 AM

I have set up HTTPS for Ambari. It works well, but I have a problem with redirecting. When I try the HTTP URL, I get an error that the page is not available. The page works fine when I use the HTTPS URL. Is there any way to redirect the Ambari HTTP URL to HTTPS?

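To illustrate what I mean (the host name and ports below are just my setup, 8080 for HTTP and 8443 for HTTPS):

curl -I  http://ambari-host.example.com:8080    # fails with "page not available" instead of redirecting
curl -Ik https://ambari-host.example.com:8443   # works fine
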
Labels: Apache Ambari

09-25-2017 12:50 PM

SmartSense looks great. Is there a free version, or only a paid one?

09-15-2017 12:03 PM

Hi. I am looking for a script which can help me configure my cluster based on the cluster hardware. I know there is a script called hdp-configuration-utils (hdp-configuration-utils on GitHub), but it is old and I don't know whether it still makes sense to use it. My question is: is there any tool or script which can help configure an HDP cluster? And is hdp-configuration-utils still a good option for that kind of task?

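For context, the kind of invocation I mean is the basic one from that repository's README (the option letters below are as I remember them, so they may be off):

python hdp-configuration-utils.py -c 4 -m 15 -d 4 -k False    # cores, memory in GB, data disks, whether HBase is installed
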
Labels: Apache Hadoop, Apache Hive

09-15-2017 08:06 AM

@Aravindan Vijayan
1. I see the same values, so it works fine.
3. But when I added up the sizes of the 3 disks from the first one, I got 99880680 KB. The second one shows 95.25. What is the reason for the difference between them?

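For reference, converting my KB total to GB gives almost exactly the second number, so maybe it is only a unit difference (please confirm):

echo "99880680 / 1024 / 1024" | bc -l    # 95.25..., which matches the 95.25 shown
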