Member since 12-30-2015

164 Posts | 29 Kudos Received | 10 Solutions

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 28753 | 01-07-2019 06:17 AM |
|  | 1475 | 12-27-2018 07:28 AM |
|  | 4439 | 11-26-2018 10:12 AM |
|  | 1927 | 11-16-2018 12:15 PM |
|  | 4162 | 10-22-2018 09:31 AM |

09-13-2022 10:56 PM

Hello @kunal_agarwal. If you are using Knox Gateway, it may be the bug presented here. To fix it, you could apply these changes to the rewrite rules of the yarnui service in the file ${KNOX_GATEWAY_HOME}/data/services/yarnui/2.7.0/rewrite.xml.
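For context, a rough sketch of applying such a change on the gateway host (assuming a standard Knox layout with KNOX_GATEWAY_HOME set; the actual rewrite-rule edits are the ones referenced above):

# back up the current yarnui rewrite rules before editing
cp "${KNOX_GATEWAY_HOME}/data/services/yarnui/2.7.0/rewrite.xml" \
   "${KNOX_GATEWAY_HOME}/data/services/yarnui/2.7.0/rewrite.xml.bak"
# apply the rule changes, then restart the gateway so the service definition is reloaded
vi "${KNOX_GATEWAY_HOME}/data/services/yarnui/2.7.0/rewrite.xml"
"${KNOX_GATEWAY_HOME}/bin/gateway.sh" stop
"${KNOX_GATEWAY_HOME}/bin/gateway.sh" start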
						
					
02-02-2022 08:15 AM

Spark and Hive use separate catalogs to access SparkSQL or Hive tables in HDP 3.0 and later. The Spark catalog contains tables created by Spark; the Hive catalog contains tables created by Hive. By default, the standard Spark APIs access tables in the Spark catalog. To access tables in the Hive catalog, edit the metastore.catalog.default property in hive-site.xml and set its value to 'hive' instead of 'spark'.

Config file path: $SPARK_HOME/conf/hive-site.xml

Before the change:

<property>
    <name>metastore.catalog.default</name>
    <value>spark</value>
</property>

After the change:

<property>
    <name>metastore.catalog.default</name>
    <value>hive</value>
</property>
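As a sketch of an alternative (assuming spark-shell on an HDP 3.x node), the same property can be passed per session through Spark's spark.hadoop.* passthrough instead of editing the file:

# session-level override; the spark.hadoop. prefix forwards the property to the
# Hadoop/Hive configuration used by the metastore client
spark-shell --conf spark.hadoop.metastore.catalog.default=hive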
						
					
12-24-2021 06:40 AM

Try to clean your metadata with the ./hbase-cleanup.sh --cleanAll command and restart your services. If you get "Regionservers are not expired. Exiting without cleaning hbase data", stop the HBase service before running the command.
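A rough outline of that sequence (the script path below is an assumption based on an HDP-style layout; adjust it to your install, and stop HBase from Ambari as you normally would):

# 1. stop the HBase service first so the region servers are expired
# 2. clean HBase metadata from both ZooKeeper and HDFS (path assumed, see above)
/usr/hdp/current/hbase-client/bin/hbase-cleanup.sh --cleanAll
# 3. restart HBase and any dependent services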
						
					
04-07-2021 03:02 AM

Huge thanks. It works for me.
						
					
02-17-2021 08:52 PM

Hi @Narendra_, as this is an older post, you would have a better chance of receiving a resolution by starting a new thread. This will also be an opportunity to provide details specific to your environment that could aid others in assisting you with a more accurate answer to your question. You can link this thread as a reference in your new post.
						
					
10-07-2020 04:28 AM

It's a service that's running, not a job, so avoid killing it.
						
					
10-01-2020 08:29 AM

Just to clarify, we had this same issue and resolved it by recursively setting the permissions correctly on the folders below /tmp/hive in HDFS (not the filesystem of your Hive Server) and then restarting the Hive services in Ambari. Generally, the folders below /tmp/hive/ required 700, apart from "_resultscache_", which needed 733.
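A sketch of the corresponding commands (run against HDFS as a user with rights over /tmp/hive, e.g. the hdfs superuser; directory names assume the default layout described above):

# per-user scratch directories below /tmp/hive get 700 (the glob leaves /tmp/hive itself untouched)
hdfs dfs -chmod -R 700 /tmp/hive/*
# the results cache is the exception and needs 733
hdfs dfs -chmod -R 733 /tmp/hive/_resultscache_
# then restart the Hive services in Ambari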
						
					
05-14-2020 10:45 PM

Did you resolve the issue? What steps did you follow? Please help me with the steps.
						
					
10-09-2019 09:28 AM

Please find the error logs below:

19/10/09 16:09:32 DEBUG ServletHandler: chain=org.apache.hadoop.security.authentication.server.AuthenticationFilter-418c020b->org.apache.spark.ui.JettyUtils$$anon$3-75e710b@986efce7==org.apache.spark.ui.JettyUtils$$anon$3,jsp=null,order=-1,inst=true
19/10/09 16:09:32 DEBUG ServletHandler: call filter org.apache.hadoop.security.authentication.server.AuthenticationFilter-418c020b
19/10/09 16:09:32 DEBUG AuthenticationFilter: Got token null from httpRequest http://ip-10-0-10.184. ************:18081/
19/10/09 16:09:32 DEBUG AuthenticationFilter: Request [http://ip-10-0-10-184.*****:18081/] triggering authentication. handler: class org.apache.hadoop.security.authentication.server.KerberosAuthenticationHandler
19/10/09 16:09:32 DEBUG AuthenticationFilter: Authentication exception: java.lang.IllegalArgumentException
org.apache.hadoop.security.authentication.client.AuthenticationException: java.lang.IllegalArgumentException
	at org.apache.hadoop.security.authentication.server.KerberosAuthenticationHandler.authenticate(KerberosAuthenticationHandler.java:306)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:536)
	at org.spark_project.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759)
	at org.spark_project.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:582)
	at org.spark_project.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1180)
	at org.spark_project.jetty.servlet.ServletHandler.doScope(ServletHandler.java:512)
	at org.spark_project.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1112)
	at org.spark_project.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.spark_project.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:493)
	at org.spark_project.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:213)
	at org.spark_project.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)
	at org.spark_project.jetty.server.Server.handle(Server.java:539)
	at org.spark_project.jetty.server.HttpChannel.handle(HttpChannel.java:333)
	at org.spark_project.jetty.server.HttpConnection.onFillable(HttpConnection.java:251)
	at org.spark_project.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:283)
	at org.spark_project.jetty.io.FillInterest.fillable(FillInterest.java:108)
	at org.spark_project.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:93)
	at org.spark_project.jetty.util.thread.strategy.ExecuteProduceConsume.executeProduceConsume(ExecuteProduceConsume.java:303)
	at org.spark_project.jetty.util.thread.strategy.ExecuteProduceConsume.produceConsume(ExecuteProduceConsume.java:148)
	at org.spark_project.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:136)
	at org.spark_project.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:671)
	at org.spark_project.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:589)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.IllegalArgumentException
	at java.nio.Buffer.limit(Buffer.java:275)
	at org.apache.hadoop.security.authentication.util.KerberosUtil$DER.<init>(KerberosUtil.java:365)
	at org.apache.hadoop.security.authentication.util.KerberosUtil$DER.<init>(KerberosUtil.java:358)
	at org.apache.hadoop.security.authentication.util.KerberosUtil.getTokenServerName(KerberosUtil.java:291)
	at org.apache.hadoop.security.authentication.server.KerberosAuthenticationHandler.authenticate(KerberosAuthenticationHandler.java:285)
	... 22 more
19/10/09 16:09:32 DEBUG GzipHttpOutputInterceptor: org.spark_project.jetty.server.handler.gzip.GzipHttpOutputInterceptor@17d4d832 exclude by status 403
19/10/09 16:09:32 DEBUG HttpChannel: sendResponse info=null content=HeapByteBuffer@26ea8849[p=0,l=365,c=32768,r=365]={<<<<html>\n<head>\n<me.../body>\n</html>\n>>>\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00...\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00} complete=true committing=true callback=Blocker@137652aa{null}
19/10/09 16:09:32 DEBUG HttpChannel: COMMIT for / on HttpChannelOverHttp@4d71d816{r=2,c=true,a=DISPATCHED,uri=//ip-10-0-10-184.******:18081/}
403 java.lang.IllegalArgumentException HTTP/1.1
Date: Wed, 09 Oct 2019 16:09:32 GMT
Set-Cookie: hadoop.auth=; HttpOnly
Cache-Control: must-revalidate,no-cache,no-store
Content-Type: text/html;charset=iso-8859-1
						
					
02-19-2019 05:24 PM

Any suggestions on how to do that? (Complete noob here.) I'm looking at https://community.hortonworks.com/articles/149486/llap-sizing-and-setup.html, but I think that is for a previous version of HDP (I'm using 3.0), so I'm not seeing exactly what I need to adjust.
						
					