Member since 01-04-2018
44 Posts · 0 Kudos Received · 0 Solutions

01-10-2018 02:04 PM
@Rupinder Singh Can you please elaborate on the exact solution to this problem? I am facing the same issue.
						
					
01-10-2018 09:31 AM
@Jay Kumar SenSharma I tried Hive 2.0 as well, but again the screen hangs. After creating a temporary table, it shows that it is inserting the rows from the temporary table into the actual table, but after that it just gets stuck.
						
					
01-10-2018 07:35 AM
@Jay Kumar SenSharma @Aditya Sirna I downloaded the sandbox for Docker from hortonworks.com again and re-installed it. All the errors were gone and the missing files were found. But now I am stuck again at the Upload Table step. Upload Table does not show any error, but the Ambari UI hangs after hitting the Upload Table button in Hive View. I'm attaching a screenshot: there is no progress after the "Upload Progress" popup. Please have a look and suggest how I can get around this.
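While the upload hangs, it may help to watch the HiveServer2 log for whatever statement the view is stuck on; a small sketch, assuming the default HDP log location:

# follow the HiveServer2 log while reproducing the hang in Hive View
tail -f /var/log/hive/hiveserver2.log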
						
					
01-10-2018 01:14 AM
@Jay Kumar SenSharma How do I check that? Should I use the command "sandbox version" in the sandbox terminal?
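A couple of quick ways to check, sketched on the assumption that this is the HDP sandbox running in Docker:

# from inside the sandbox shell: list the installed HDP stack versions
hdp-select versions

# from the Docker host: the image tag usually carries the sandbox release
docker images | grep -i sandbox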
						
					
01-10-2018 12:59 AM
@Jay Kumar SenSharma I have tried to create a new sandbox image in Docker several times before, and similar errors have always cropped up. I feel that there is something wrong with the Hortonworks Sandbox for Docker that has been uploaded on the official website. 😞 I downloaded the Hortonworks Sandbox for Docker from here: https://hortonworks.com/downloads/#
						
					
01-10-2018 12:35 AM
@Aditya Sirna I tried running the first three commands using:
{username} as admin
{password} as admin
{ambari-host} as sandbox.hortonworks.com
{port} as the Ambari port (default 8080)
{clustername} as Sandbox
{hostname} as sandbox.hortonworks.com

For example, I ran the following command:

curl -k -u admin:admin -H "X-Requested-By:ambari" -i -X PUT -d '{"HostRoles": {"state": "INSTALLED"}}' http://sandbox.hortonworks.com:8080/api/v1/clusters/Sandbox/hosts/sandbox.hortonworks.com/host_components/HDFS_CLIENT

But I got the following error:

HTTP/1.1 404 Not Found
X-Frame-Options: DENY
X-XSS-Protection: 1; mode=block
X-Content-Type-Options: nosniff
Cache-Control: no-store
Pragma: no-cache
Set-Cookie: AMBARISESSIONID=14jxcimd3368rh42fh6lsxsnx;Path=/;HttpOnly
Expires: Thu, 01 Jan 1970 00:00:00 GMT
User: admin
Content-Type: text/plain
Content-Length: 279
{
  "status" : 404,
  "message" : "org.apache.ambari.server.controller.spi.NoSuchParentResourceException: Parent Host resource doesn't exist.  Host not found, cluster=Sandbox, hostname=sandbox.hortonworks.com.  Host not found, cluster=Sandbox, hostname=sandbox.hortonworks.com"
}

What should I do now?
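Since the 404 says Ambari cannot find a host called sandbox.hortonworks.com in the cluster, the host may be registered under a different name (newer sandbox builds often use sandbox-hdp.hortonworks.com). A quick way to see the exact hostname Ambari expects, sketched with the same credentials and port as above:

# list the hosts Ambari has registered for the cluster, then reuse that exact host_name in the PUT URL
curl -k -u admin:admin -H "X-Requested-By:ambari" http://sandbox.hortonworks.com:8080/api/v1/clusters/Sandbox/hosts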
						
					
01-10-2018 12:30 AM
@Jay Kumar SenSharma I was unable to restart the HDFS services successfully; there was some error. Also, the folder "/etc/hadoop/conf" is missing. As for the Install Clients option you suggested, I went through it and it installed only the HCat client; I think the rest of the clients were pre-installed. What should I do now?
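Before trying another restart, it may help to see whether "/etc/hadoop/conf" is truly gone or is just a broken symlink, since on HDP both that path and the hadoop-client conf path are normally symlinks managed by the platform. A purely diagnostic sketch (no changes made):

# check whether the conf paths exist, and whether they are symlinks or real directories
ls -ld /etc/hadoop /etc/hadoop/conf
ls -ld /usr/hdp/current/hadoop-client/conf

# list which versioned conf directories still exist under /etc/hadoop
ls /etc/hadoop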
						
					
01-09-2018 05:33 PM
@Aditya Sirna In the folder '/usr/hdp/current/hadoop-client/conf/', I found the following files:

capacity-scheduler.xml
commons-logging.properties
hadoop-env.sh
hadoop-metrics2.properties
health_check
log4j.properties
mapred-site.xml
ranger-security.xml
secure
task-log4j.properties
yarn-site.xml

The file core-site.xml is missing. Also, the folder '/etc/hadoop/conf' is missing.
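If only core-site.xml has been lost, Ambari can regenerate a full set of HDFS client configs through its REST API (the same mechanism as the "Download Client Configs" action in the UI). A sketch, assuming admin/admin, the default port, and that the cluster and host names match what Ambari reports; the layout inside the tarball may differ slightly:

# download a tarball of the HDFS client configuration files from Ambari
curl -k -u admin:admin -H "X-Requested-By:ambari" "http://sandbox.hortonworks.com:8080/api/v1/clusters/Sandbox/services/HDFS/components/HDFS_CLIENT?format=client_config_tar" -o hdfs_client_config.tar.gz

# unpack and copy core-site.xml back into the client conf directory
tar -xzf hdfs_client_config.tar.gz
cp core-site.xml /usr/hdp/current/hadoop-client/conf/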
						
					
01-09-2018 02:31 PM
@Aditya Sirna @Jay Kumar SenSharma I was trying to upload a database table in Hive View. After choosing the file and making the relevant settings, when I hit the Upload Table option I get the following error:

java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:java.security.AccessControlException: Permission denied: user=hive, path="file:/":root:root:drwxr-xr-x)

So I was denied permission to upload the table. I am using the maria_dev account, which is there by default in the Ambari UI. I tried to change the permission and try again by using the following command:

hdfs dfs -chmod 777 /

On running this command in the Hortonworks sandbox terminal, I got the following error:

Exception in thread "main" java.lang.RuntimeException: core-site.xml not found
        at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2640)                                                                
        at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2566)                                                               
        at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2451)                                                                    
        at org.apache.hadoop.conf.Configuration.set(Configuration.java:1164)                                                                         
        at org.apache.hadoop.conf.Configuration.set(Configuration.java:1136)                                                                         
        at org.apache.hadoop.conf.Configuration.setBoolean(Configuration.java:1472)                                                                  
        at org.apache.hadoop.util.GenericOptionsParser.processGeneralOptions(GenericOptionsParser.java:321)                                          
        at org.apache.hadoop.util.GenericOptionsParser.parseGeneralOptions(GenericOptionsParser.java:487)                                            
        at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:170)                                                         
        at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:153)                                                         
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)                                                                                 
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)                                                                                 
        at org.apache.hadoop.fs.FsShell.main(FsShell.java:356) 
I used to get a similar error earlier too. I was uploading the table from my laptop's desktop, i.e. the local file system, and I didn't use any Hive query to upload it. In the Hive View of the Ambari UI there is an "Upload Table" option; I clicked it, set the field delimiter to tab-delimited, and then clicked "Upload Table". After this, I got the error mentioned above. Can somebody help me sort out this error?
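For reference, the chmod needs a space between the mode and the path, and on the sandbox it is normally run as the hdfs superuser. A sketch of the intended command, which will only succeed once the client can actually find core-site.xml (and note that 777 on the HDFS root is far too permissive for anything beyond a throwaway sandbox):

# open up the HDFS root for the Hive View upload; sandbox/lab use only
sudo -u hdfs hdfs dfs -chmod 777 /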
						
					
Labels: Apache Ambari, Apache Hadoop, Apache Hive

01-05-2018 11:02 AM
I tried to set the permission for Hive by using the following command:

# hdfs dfs -chmod 777 /

However, I got the following error:

Exception in thread "main" java.lang.RuntimeException: core-site.xml not found
        at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2640)                                                                
        at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2566)                                                               
        at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2451)                                                                    
        at org.apache.hadoop.conf.Configuration.set(Configuration.java:1164)                                                                         
        at org.apache.hadoop.conf.Configuration.set(Configuration.java:1136)                                                                         
        at org.apache.hadoop.conf.Configuration.setBoolean(Configuration.java:1472)                                                                  
        at org.apache.hadoop.util.GenericOptionsParser.processGeneralOptions(GenericOptionsParser.java:321)                                          
        at org.apache.hadoop.util.GenericOptionsParser.parseGeneralOptions(GenericOptionsParser.java:487)                                            
        at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:170)                                                         
        at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:153)                                                         
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)                                                                                 
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)                                                                                 
        at org.apache.hadoop.fs.FsShell.main(FsShell.java:356)   
I got the same error when I typed the following commands:

sudo -u hdfs hadoop fs -mkdir /user/hive/warehouse
hdfs dfs -ls

But I am unable to figure out a solution. Can someone please help me out?
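The RuntimeException means the Hadoop client cannot locate core-site.xml at all, so every hdfs/hadoop command fails before it even reaches the cluster. A first diagnostic sketch, assuming a standard HDP-style layout:

# look for any surviving copy of core-site.xml under the usual HDP locations
find /etc/hadoop /usr/hdp -name core-site.xml 2>/dev/null

# check which conf directory the client is pointed at, if the variable is set
echo "$HADOOP_CONF_DIR"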
						
					
Labels: Apache Hadoop