STARTUP_MSG: build = git@github.com:hortonworks/hadoop.git -r 2820e4d6fc7ec31ac42187083ed5933c823e9784; compiled by 'jenkins' on 2018-09-19T10:19Z
STARTUP_MSG: java = 1.8.0_112
************************************************************/
2019-07-22 12:44:58,402 INFO datanode.DataNode (LogAdapter.java:info(51)) - registered UNIX signal handlers for [TERM, HUP, INT]
2019-07-22 12:44:58,840 INFO checker.ThrottledAsyncChecker (ThrottledAsyncChecker.java:schedule(137)) - Scheduling a check for [DISK]file:/hadoop/hdfs/data
2019-07-22 12:44:58,851 INFO checker.ThrottledAsyncChecker (ThrottledAsyncChecker.java:schedule(137)) - Scheduling a check for [DISK]file:/data/hadoop/hdfs/data
2019-07-22 12:44:58,852 INFO checker.ThrottledAsyncChecker (ThrottledAsyncChecker.java:schedule(137)) - Scheduling a check for [DISK]file:/var/hadoop/hdfs/data
2019-07-22 12:44:58,959 INFO impl.MetricsConfig (MetricsConfig.java:loadFirst(118)) - Loaded properties from hadoop-metrics2.properties
2019-07-22 12:44:59,328 INFO timeline.HadoopTimelineMetricsSink (HadoopTimelineMetricsSink.java:init(85)) - Initializing Timeline metrics sink.
2019-07-22 12:44:59,329 INFO timeline.HadoopTimelineMetricsSink (HadoopTimelineMetricsSink.java:init(105)) - Identified hostname = datanode05, serviceName = datanode
2019-07-22 12:44:59,377 INFO availability.MetricSinkWriteShardHostnameHashingStrategy (MetricSinkWriteShardHostnameHashingStrategy.java:findCollectorShard(42)) - Calculated collector shard datanode04 based on hostname: datanode05
2019-07-22 12:44:59,378 INFO timeline.HadoopTimelineMetricsSink (HadoopTimelineMetricsSink.java:init(130)) - Collector Uri: http://datanode04:6188/ws/v1/timeline/metrics
2019-07-22 12:44:59,378 INFO timeline.HadoopTimelineMetricsSink (HadoopTimelineMetricsSink.java:init(131)) - Container Metrics Uri: http://datanode04:6188/ws/v1/timeline/containermetrics
2019-07-22 12:44:59,385 INFO impl.MetricsSinkAdapter (MetricsSinkAdapter.java:start(204)) - Sink timeline started
2019-07-22 12:44:59,441 INFO impl.MetricsSystemImpl (MetricsSystemImpl.java:startTimer(374)) - Scheduled Metric snapshot period at 10 second(s).
2019-07-22 12:44:59,441 INFO impl.MetricsSystemImpl (MetricsSystemImpl.java:start(191)) - DataNode metrics system started
2019-07-22 12:44:59,672 INFO common.Util (Util.java:isDiskStatsEnabled(395)) - dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
2019-07-22 12:44:59,675 INFO datanode.BlockScanner (BlockScanner.java:<init>(184)) - Initialized block scanner with targetBytesPerSec 1048576
2019-07-22 12:44:59,681 INFO datanode.DataNode (DataNode.java:<init>(486)) - File descriptor passing is enabled.
2019-07-22 12:44:59,681 INFO datanode.DataNode (DataNode.java:<init>(499)) - Configured hostname is datanode05
2019-07-22 12:44:59,682 INFO common.Util (Util.java:isDiskStatsEnabled(395)) - dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
2019-07-22 12:44:59,685 INFO datanode.DataNode (DataNode.java:startDataNode(1399)) - Starting DataNode with maxLockedMemory = 0
2019-07-22 12:44:59,706 INFO datanode.DataNode (DataNode.java:initDataXceiver(1147)) - Opened streaming server at /0.0.0.0:50010
2019-07-22 12:44:59,708 INFO datanode.DataNode (DataXceiverServer.java:<init>(78)) - Balancing bandwidth is 6250000 bytes/s
2019-07-22 12:44:59,708 INFO datanode.DataNode (DataXceiverServer.java:<init>(79)) - Number threads for balancing is 50
2019-07-22 12:44:59,711 INFO datanode.DataNode (DataXceiverServer.java:<init>(78)) - Balancing bandwidth is 6250000 bytes/s
2019-07-22 12:44:59,711 INFO datanode.DataNode (DataXceiverServer.java:<init>(79)) - Number threads for balancing is 50
2019-07-22 12:44:59,711 INFO datanode.DataNode (DataNode.java:initDataXceiver(1165)) - Listening on UNIX domain socket: /var/lib/hadoop-hdfs/dn_socket
2019-07-22 12:44:59,751 INFO util.log (Log.java:initialized(192)) - Logging initialized @1916ms
2019-07-22 12:44:59,842 INFO server.AuthenticationFilter (AuthenticationFilter.java:constructSecretProvider(240)) - Unable to initialize FileSignerSecretProvider, falling back to use random secrets.
2019-07-22 12:44:59,845 INFO http.HttpRequestLog (HttpRequestLog.java:getRequestLog(81)) - Http request log for http.requests.datanode is not defined
2019-07-22 12:44:59,851 INFO http.HttpServer2 (HttpServer2.java:addGlobalFilter(968)) - Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2019-07-22 12:44:59,854 INFO http.HttpServer2 (HttpServer2.java:addFilter(941)) - Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context datanode
2019-07-22 12:44:59,854 INFO http.HttpServer2 (HttpServer2.java:addFilter(951)) - Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context logs
2019-07-22 12:44:59,854 INFO http.HttpServer2 (HttpServer2.java:addFilter(951)) - Added filter authentication (class=org.apache.hadoop.security.authentication.server.AuthenticationFilter) to context static
2019-07-22 12:44:59,855 INFO security.HttpCrossOriginFilterInitializer (HttpCrossOriginFilterInitializer.java:initFilter(49)) - CORS filter not enabled. Please set hadoop.http.cross-origin.enabled to 'true' to enable it
2019-07-22 12:44:59,877 INFO http.HttpServer2 (HttpServer2.java:bindListener(1185)) - Jetty bound to port 35953
2019-07-22 12:44:59,879 INFO server.Server (Server.java:doStart(346)) - jetty-9.3.19.v20170502
2019-07-22 12:44:59,912 INFO server.AuthenticationFilter (AuthenticationFilter.java:constructSecretProvider(240)) - Unable to initialize FileSignerSecretProvider, falling back to use random secrets.
2019-07-22 12:44:59,914 INFO handler.ContextHandler (ContextHandler.java:doStart(781)) - Started o.e.j.s.ServletContextHandler@2a79d4b1{/logs,file:///var/log/hadoop/hdfs/,AVAILABLE}
2019-07-22 12:44:59,915 INFO handler.ContextHandler (ContextHandler.java:doStart(781)) - Started o.e.j.s.ServletContextHandler@17cdf2d0{/static,file:///usr/hdp/3.0.1.0-187/hadoop-hdfs/webapps/static/,AVAILABLE}
2019-07-22 12:44:59,927 WARN webapp.WebAppContext (WebAppContext.java:doStart(531)) - Failed startup of context o.e.j.w.WebAppContext@cb191ca{/,null,null}{/datanode}
java.lang.IllegalStateException: Temp dir /tmp/jetty-localhost-35953-datanode-_-any-4263088992112773049.dir not useable: writeable=false, dir=true
	at org.eclipse.jetty.webapp.WebInfConfiguration.configureTempDirectory(WebInfConfiguration.java:388)
	at org.eclipse.jetty.webapp.WebInfConfiguration.makeTempDirectory(WebInfConfiguration.java:360)
	at org.eclipse.jetty.webapp.WebInfConfiguration.resolveTempDirectory(WebInfConfiguration.java:308)
	at org.eclipse.jetty.webapp.WebInfConfiguration.preConfigure(WebInfConfiguration.java:69)
	at org.eclipse.jetty.webapp.WebAppContext.preConfigure(WebAppContext.java:485)
	at org.eclipse.jetty.webapp.WebAppContext.doStart(WebAppContext.java:521)
	at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
	at org.eclipse.jetty.util.component.ContainerLifeCycle.start(ContainerLifeCycle.java:131)
	at org.eclipse.jetty.util.component.ContainerLifeCycle.doStart(ContainerLifeCycle.java:113)
	at org.eclipse.jetty.server.handler.AbstractHandler.doStart(AbstractHandler.java:61)
	at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
	at org.eclipse.jetty.util.component.ContainerLifeCycle.start(ContainerLifeCycle.java:131)
	at org.eclipse.jetty.server.Server.start(Server.java:422)
	at org.eclipse.jetty.util.component.ContainerLifeCycle.doStart(ContainerLifeCycle.java:105)
	at org.eclipse.jetty.server.handler.AbstractHandler.doStart(AbstractHandler.java:61)
	at org.eclipse.jetty.server.Server.doStart(Server.java:389)
	at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
	at org.apache.hadoop.http.HttpServer2.start(HttpServer2.java:1134)
	at org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer.<init>(DatanodeHttpServer.java:141)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.startInfoServer(DataNode.java:957)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:1417)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:500)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2782)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2690)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2732)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2876)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2900)
2019-07-22 12:44:59,935 INFO server.AbstractConnector (AbstractConnector.java:doStart(278)) - Started ServerConnector@53812a9b{HTTP/1.1,[http/1.1]}{localhost:35953}
2019-07-22 12:44:59,935 INFO server.Server (Server.java:doStart(414)) - Started @2100ms
2019-07-22 12:44:59,939 INFO server.AbstractConnector (AbstractConnector.java:doStop(318)) - Stopped ServerConnector@53812a9b{HTTP/1.1,[http/1.1]}{localhost:0}
2019-07-22 12:44:59,939 INFO handler.ContextHandler (ContextHandler.java:doStop(910)) - Stopped o.e.j.w.WebAppContext@cb191ca{/,null,UNAVAILABLE}{/datanode}
2019-07-22 12:44:59,944 INFO datanode.DataNode (DataNode.java:shutdown(2134)) - Shutdown complete.
2019-07-22 12:44:59,944 ERROR datanode.DataNode (DataNode.java:secureMain(2883)) - Exception in secureMain
java.io.IOException: Problem starting http server
	at org.apache.hadoop.http.HttpServer2.start(HttpServer2.java:1165)
	at org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer.<init>(DatanodeHttpServer.java:141)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.startInfoServer(DataNode.java:957)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:1417)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:500)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2782)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2690)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2732)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2876)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2900)
Caused by: java.lang.NullPointerException
	at org.eclipse.jetty.util.IO.delete(IO.java:344)
	at org.eclipse.jetty.webapp.WebInfConfiguration.deconfigure(WebInfConfiguration.java:195)
	at org.eclipse.jetty.webapp.WebAppContext.stopContext(WebAppContext.java:1380)
	at org.eclipse.jetty.server.handler.ContextHandler.doStop(ContextHandler.java:880)
	at org.eclipse.jetty.servlet.ServletContextHandler.doStop(ServletContextHandler.java:272)
	at org.eclipse.jetty.webapp.WebAppContext.doStop(WebAppContext.java:546)
	at org.eclipse.jetty.util.component.AbstractLifeCycle.stop(AbstractLifeCycle.java:89)
	at org.eclipse.jetty.util.component.ContainerLifeCycle.stop(ContainerLifeCycle.java:142)
	at org.eclipse.jetty.util.component.ContainerLifeCycle.doStop(ContainerLifeCycle.java:160)
	at org.eclipse.jetty.server.handler.AbstractHandler.doStop(AbstractHandler.java:73)
	at org.eclipse.jetty.util.component.AbstractLifeCycle.stop(AbstractLifeCycle.java:89)
	at org.eclipse.jetty.util.component.ContainerLifeCycle.stop(ContainerLifeCycle.java:142)
	at org.eclipse.jetty.util.component.ContainerLifeCycle.doStop(ContainerLifeCycle.java:160)
	at org.eclipse.jetty.server.handler.AbstractHandler.doStop(AbstractHandler.java:73)
	at org.eclipse.jetty.server.Server.doStop(Server.java:493)
	at org.eclipse.jetty.util.component.AbstractLifeCycle.stop(AbstractLifeCycle.java:89)
	at org.apache.hadoop.http.HttpServer2.start(HttpServer2.java:1155)
	... 9 more
2019-07-22 12:44:59,946 INFO util.ExitUtil (ExitUtil.java:terminate(210)) - Exiting with status 1: java.io.IOException: Problem starting http server
2019-07-22 12:44:59,949 INFO datanode.DataNode (LogAdapter.java:info(51)) - SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at datanode05/10.49.194.175
************************************************************/
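The root cause is stated in the WARN entry above: Jetty could not use its work directory under /tmp ("not useable: writeable=false, dir=true"), so the DataNode's HTTP server failed to start and the process exited with status 1. A minimal, non-destructive way to check the usual culprit is to inspect the permissions of the directory Jetty resolves from the java.io.tmpdir system property (which defaults to /tmp); the mode value and the alternative scratch path below are illustrative, not taken from this log:

```shell
# Jetty's webapp extractor creates its temp dir under java.io.tmpdir
# (default: /tmp). The log reports writeable=false, which on Linux
# usually means /tmp lost its world-writable sticky mode (1777) or
# changed ownership. This only inspects; it changes nothing.
stat -c 'mode=%a owner=%U path=%n' /tmp
```

If the mode is not 1777, either restore it as root (`chmod 1777 /tmp`) or point the JVM at a different scratch directory, e.g. by adding `-Djava.io.tmpdir=/var/tmp/hdfs` (a hypothetical path) to the DataNode's JVM options, then restart the DataNode.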