Member since: 09-20-2018
Posts: 366
Kudos Received: 0
Solutions: 1
My Accepted Solutions
| Title | Views | Posted |
| --- | --- | --- |
|  | 3071 | 05-14-2019 10:47 AM |
10-11-2019 03:59 AM
Hi, Have you tried taking a backup of the folder /var/lib/hadoop-yarn/yarn-nm-recovery/yarn-nm-state, deleting all the contents under /var/lib/hadoop-yarn/yarn-nm-recovery/yarn-nm-state, and then restarting the affected NodeManager? Please share the updates. Thanks, AKR
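A rough sketch of those steps, assuming the recovery directory path quoted above and a systemd-managed NodeManager (the restart command and service name are assumptions and vary by distribution; use your cluster manager's restart action if you have one):

```bash
# Back up the NodeManager recovery state directory mentioned above
cp -a /var/lib/hadoop-yarn/yarn-nm-recovery/yarn-nm-state \
      /var/lib/hadoop-yarn/yarn-nm-recovery/yarn-nm-state.bak

# Clear the recovery state so the NodeManager starts fresh
rm -rf /var/lib/hadoop-yarn/yarn-nm-recovery/yarn-nm-state/*

# Restart the affected NodeManager (service name is an assumption)
sudo systemctl restart hadoop-yarn-nodemanager
```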
10-11-2019 03:09 AM
Hi, Are you still getting the same error even after increasing the overhead memory? Could you please share the error messages you see after increasing the overhead / executor / driver memory? Thanks, AK
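For reference, a hedged example of how those settings can be raised on spark-submit. The values and the application jar (your_app.jar) are placeholders, and on older Spark versions the overhead keys are spark.yarn.executor.memoryOverhead / spark.yarn.driver.memoryOverhead instead:

```bash
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --driver-memory 4g \
  --executor-memory 8g \
  --conf spark.executor.memoryOverhead=1024 \
  --conf spark.driver.memoryOverhead=1024 \
  your_app.jar
```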
10-09-2019 09:28 AM
PFA the below error logs:

19/10/09 16:09:32 DEBUG ServletHandler: chain=org.apache.hadoop.security.authentication.server.AuthenticationFilter-418c020b->org.apache.spark.ui.JettyUtils$$anon$3-75e710b@986efce7==org.apache.spark.ui.JettyUtils$$anon$3,jsp=null,order=-1,inst=true
19/10/09 16:09:32 DEBUG ServletHandler: call filter org.apache.hadoop.security.authentication.server.AuthenticationFilter-418c020b
19/10/09 16:09:32 DEBUG AuthenticationFilter: Got token null from httpRequest http://ip-10-0-10.184. ************:18081/
19/10/09 16:09:32 DEBUG AuthenticationFilter: Request [http://ip-10-0-10-184.*****:18081/] triggering authentication. handler: class org.apache.hadoop.security.authentication.server.KerberosAuthenticationHandler
19/10/09 16:09:32 DEBUG AuthenticationFilter: Authentication exception: java.lang.IllegalArgumentException
org.apache.hadoop.security.authentication.client.AuthenticationException: java.lang.IllegalArgumentException
    at org.apache.hadoop.security.authentication.server.KerberosAuthenticationHandler.authenticate(KerberosAuthenticationHandler.java:306)
    at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:536)
    at org.spark_project.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759)
    at org.spark_project.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:582)
    at org.spark_project.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1180)
    at org.spark_project.jetty.servlet.ServletHandler.doScope(ServletHandler.java:512)
    at org.spark_project.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1112)
    at org.spark_project.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.spark_project.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:493)
    at org.spark_project.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:213)
    at org.spark_project.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)
    at org.spark_project.jetty.server.Server.handle(Server.java:539)
    at org.spark_project.jetty.server.HttpChannel.handle(HttpChannel.java:333)
    at org.spark_project.jetty.server.HttpConnection.onFillable(HttpConnection.java:251)
    at org.spark_project.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:283)
    at org.spark_project.jetty.io.FillInterest.fillable(FillInterest.java:108)
    at org.spark_project.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:93)
    at org.spark_project.jetty.util.thread.strategy.ExecuteProduceConsume.executeProduceConsume(ExecuteProduceConsume.java:303)
    at org.spark_project.jetty.util.thread.strategy.ExecuteProduceConsume.produceConsume(ExecuteProduceConsume.java:148)
    at org.spark_project.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:136)
    at org.spark_project.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:671)
    at org.spark_project.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:589)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.IllegalArgumentException
    at java.nio.Buffer.limit(Buffer.java:275)
    at org.apache.hadoop.security.authentication.util.KerberosUtil$DER.<init>(KerberosUtil.java:365)
    at org.apache.hadoop.security.authentication.util.KerberosUtil$DER.<init>(KerberosUtil.java:358)
    at org.apache.hadoop.security.authentication.util.KerberosUtil.getTokenServerName(KerberosUtil.java:291)
    at org.apache.hadoop.security.authentication.server.KerberosAuthenticationHandler.authenticate(KerberosAuthenticationHandler.java:285)
    ... 22 more
19/10/09 16:09:32 DEBUG GzipHttpOutputInterceptor: org.spark_project.jetty.server.handler.gzip.GzipHttpOutputInterceptor@17d4d832 exclude by status 403
19/10/09 16:09:32 DEBUG HttpChannel: sendResponse info=null content=HeapByteBuffer@26ea8849[p=0,l=365,c=32768,r=365]={<<<<html>\n<head>\n<me.../body>\n</html>\n>>>\x00\x00\x00\x00\x00\x00\x00\x00...\x00\x00\x00\x00\x00\x00\x00\x00} complete=true committing=true callback=Blocker@137652aa{null}
19/10/09 16:09:32 DEBUG HttpChannel: COMMIT for / on HttpChannelOverHttp@4d71d816{r=2,c=true,a=DISPATCHED,uri=//ip-10-0-10-184.******:18081/} 403 java.lang.IllegalArgumentException HTTP/1.1
Date: Wed, 09 Oct 2019 16:09:32 GMT
Set-Cookie: hadoop.auth=; HttpOnly
Cache-Control: must-revalidate,no-cache,no-store
Content-Type: text/html;charset=iso-8859-1
09-08-2019 04:28 PM
@ilia987, The message "Driver grid-05.test.com:36315 disassociated! Shutting down" sounds like the AM had trouble getting back to the Driver. Can you share the below info:
- Did you run Spark in cluster or client mode?
- What is the full command?
- What is the error on the client side where you ran spark-submit?
- What is the error on the YARN side?
As suggested by @AKR, please share the entire application logs. Cheers, Eric
09-08-2019 06:58 AM
Hi, Can you share the error message from the Spark History Server logs from when the Spark History UI crashes? Thanks, AKR
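If it helps, one way to pull recent errors out of the Spark History Server log; the log directory and file name pattern below are assumptions (they differ per distribution), so adjust them to your install:

```bash
# Show recent errors/exceptions from the Spark History Server log
grep -iE 'error|exception' /var/log/spark/spark-history-server*.log | tail -n 100
```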
09-05-2019 08:37 AM
Hi, Exit code 143 can happen for multiple reasons, one of which is memory/GC issues. Your default mapper/reducer memory settings may not be sufficient to run a large data set, so try setting higher AM, map and reduce memory when a large YARN job is invoked. For more details please refer to this link: https://stackoverflow.com/questions/42972908/container-killed-by-the-applicationmaster-exit-code-is-143 Thanks, AKR
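As an illustration only, the kind of memory bumps that usually help for a plain MapReduce job; the jar, class and paths are placeholders, and the right values depend on your cluster and data size:

```bash
# -D overrides are only picked up if the job's main class uses ToolRunner /
# GenericOptionsParser; otherwise set these in mapred-site.xml or the job config
hadoop jar my-job.jar MyJobClass \
  -Dyarn.app.mapreduce.am.resource.mb=4096 \
  -Dmapreduce.map.memory.mb=4096 \
  -Dmapreduce.map.java.opts=-Xmx3277m \
  -Dmapreduce.reduce.memory.mb=8192 \
  -Dmapreduce.reduce.java.opts=-Xmx6554m \
  /input/path /output/path
```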
09-05-2019 08:34 AM
Hi, Exit code 143 is related to memory/GC issues. Your default mapper/reducer memory settings may not be sufficient to run a large data set, so try setting higher AM, map and reduce memory when a large YARN job is invoked. For more details please refer to this link: https://stackoverflow.com/questions/42972908/container-killed-by-the-applicationmaster-exit-code-is-143 Thanks, AKR
09-04-2019 10:31 AM
Hi, Check the total number of applications in the application history path; if the number of files is large, try increasing the heap size and see whether that helps. Alternatively, look through the Spark History Server logs for any errors. Thanks, AKR
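A quick way to check how many event-log files the history server is loading, plus one common way to give it a larger heap. The HDFS path is an assumption (use your spark.history.fs.logDirectory value), and SPARK_DAEMON_MEMORY can also be set through your cluster manager instead of spark-env.sh:

```bash
# Count the event-log files under the application history path
hdfs dfs -count /user/spark/applicationHistory

# In spark-env.sh: raise the heap used by the Spark History Server daemon
export SPARK_DAEMON_MEMORY=4g
```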
09-03-2019 11:19 AM
Hi, We need to review the ResourceManager logs for any errors. We also need to check the ResourceManager web UI for resource and memory utilization, per queue, for the submitted jobs. Thanks, AKR
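If useful, a couple of hedged starting points; the ResourceManager log path is an assumption (it differs per distribution), and the queue name is a placeholder:

```bash
# Scan the ResourceManager log for recent errors
grep -iE 'error|exception' /var/log/hadoop-yarn/*resourcemanager*.log | tail -n 100

# Check capacity and current usage for a specific queue
yarn queue -status default
```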
09-03-2019 11:09 AM
Hi, To view the Spark logs of a completed application, run the command below:
yarn logs -applicationId application_xxxxxxxxxxxxx_yyyyyy -appOwner <userowner> > application_xxxxxxxxxxxxx_yyyyyy.log
Thanks, AKR